🦒 colab #9
Comments
@camenduru You really did a great job. I tried doing it myself before checking the issues tab; my colab works, but when running a generation the whole 15 GB of VRAM gets eaten up very fast, and the colab restarts after exhausting all the available RAM. I have seen this same problem with all the models from Hugging Face lately (example: https://discuss.huggingface.co/t/colab-ram-limit-exceeded-unable-to-run-3b-model-even-with-quantization/50022, have a look if you can please). I don't know how you get these models to run, but anyway, thank you.
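For context on why a ~3B-parameter model can exhaust a 15 GB Colab GPU: a rough lower bound on the weight footprint is parameter count times bytes per parameter, before activations, KV caches, or any framework overhead are counted. The helper below is a hypothetical back-of-envelope sketch (not code from this thread or from Fooocus):

```python
def weight_footprint_gib(param_count: float, bytes_per_param: int) -> float:
    """Approximate model weight memory in GiB at a given precision.

    This ignores activations, caches, and framework overhead, so real
    usage is always higher than this estimate.
    """
    return param_count * bytes_per_param / 1024**3


# A 3B-parameter model at common precisions:
params_3b = 3e9
for name, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{name}: ~{weight_footprint_gib(params_3b, nbytes):.1f} GiB")
```

At fp32 the weights alone are roughly 11 GiB, which already nearly fills a 15 GB card; fp16 or int8 quantization halves or quarters that, but the extra working memory during generation can still push past the limit.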
thanks! added official version
It does not run successfully every time. If the panel address in the fourth step does not appear, you need to re-run the colab file. If it fails many times, please confirm that the T4 GPU and other account permissions are enabled. Don't pull the repository from GitHub; run it directly. You can make a copy of the configuration file, but don't pull it. I don't know why, but when I do that, the file fails to run. Thanks again to the author of Fooocus and the author of the colab file, thank you!!
@lianghongli2 Hi, you are using the old colab. If you use the new official one from the README, this problem is already solved.
Let's close this issue and continue with the official colab 🥳
Sure. Great work!
Thanks for the project ❤️ I made a colab. 🥳 I hope you like it. https://github.com/camenduru/Fooocus-colab