Local Windows Implementation
Hi, will you release a local implementation guide for Windows? I managed to get this running locally, but the results seem to be all over the place and heavily compressed.
Is it possible to run this model locally?
Yes, this space is fully Open Source and can be run locally.
It is highly unlikely that we will release documentation for Windows, though, as nobody on our team runs it. It can be run on a Linux machine with a GPU that has enough VRAM (for instance, a 3090 or a 4090 should work).
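For anyone checking whether their machine qualifies, here is a minimal Python sketch (assuming PyTorch is installed, and taking the ~24 GB of a 3090/4090 as the reference point, which is an assumption on my part) that reports the detected GPU and its VRAM:

```python
import torch

# Minimal sketch: check whether a CUDA GPU is present and how much VRAM it has.
if not torch.cuda.is_available():
    raise SystemExit("No CUDA-capable GPU detected; this Space needs one to run locally.")

props = torch.cuda.get_device_properties(0)
vram_gb = props.total_memory / 1024**3
print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")

# 24 GB is an assumed threshold based on the 3090/4090 recommendation above.
if vram_gb < 24:
    print("Warning: less VRAM than a 3090/4090-class card; the model may not fit in memory.")
```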
My graphics card is a GeForce GTX 960M. Is it possible for me to run this model?
This graphics card has only 4 GB of VRAM, so I do not think it can run, at least not without some changes.
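The kind of changes that would be needed are typically half-precision loading and CPU offload. A rough sketch, assuming the Space wraps a standard Hugging Face transformers checkpoint; the model id below is purely a placeholder, not the checkpoint this Space actually uses:

```python
import torch
from transformers import AutoModel

# "some-org/some-model" is a placeholder for the Space's real checkpoint.
model = AutoModel.from_pretrained(
    "some-org/some-model",
    torch_dtype=torch.float16,  # half precision roughly halves the weight memory
    device_map="auto",          # lets accelerate offload layers to CPU when VRAM runs out
)
```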
If it is possible, could you optimize it to run on Google Colab and share the script?
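Not an official script, but a rough sketch of what running the Space in a Colab GPU runtime could look like, assuming it is a standard Gradio Space with an app.py and a requirements.txt; the repo id below is a placeholder:

```python
import subprocess
from huggingface_hub import snapshot_download

# "user/space-name" is a placeholder; substitute this Space's actual repo id.
local_dir = snapshot_download(repo_id="user/space-name", repo_type="space", local_dir="space")

# Install the Space's dependencies, then launch its Gradio app.
subprocess.run(["pip", "install", "-r", f"{local_dir}/requirements.txt"], check=True)
subprocess.run(["python", f"{local_dir}/app.py"], check=True)
```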