What are the minimum RAM requirements? #10
Comments
Ah, my bad, I should have mentioned this earlier! I've provided an option here to trade GPU RAM for CPU RAM: uncommenting this line will load the models into CPU RAM first, and use GPU RAM only when needed!
The reason this project requires more GPU RAM than vanilla SD is that it actually integrates FOUR different SD versions together, along with many other models as well 🤣.
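The trade described above (keep models in CPU RAM, borrow GPU RAM only for the duration of a call) can be sketched in plain Python. This is a hypothetical illustration, not the project's actual code: `DummyModel` and `LazyGPUWrapper` are made-up names, and a real implementation would move PyTorch modules with `model.to(device)`.

```python
class DummyModel:
    """Stand-in for a large network (hypothetical; real code would be a PyTorch module)."""
    def __init__(self, name):
        self.name = name
        self.device = "cpu"  # models start out in CPU RAM

    def to(self, device):
        self.device = device
        return self


class LazyGPUWrapper:
    """Keep the model in CPU RAM; move it to the GPU only while it is running."""
    def __init__(self, model):
        self.model = model.to("cpu")

    def __call__(self, run):
        self.model.to("cuda")      # borrow GPU RAM only for this inference
        try:
            return run(self.model)
        finally:
            self.model.to("cpu")   # release the GPU RAM right afterwards


wrapper = LazyGPUWrapper(DummyModel("sd_v1"))
devices_seen = []
wrapper(lambda m: devices_seen.append(m.device))
```

The cost of this scheme is the CPU-to-GPU transfer on every call, which is why it is an opt-in line rather than the default.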
Wow, so cool! It seems to load now! Thanks for the help! I'm using the OPT because I do want to see the features together, especially all the img2img-related features.
That's great 🥳!
@carefree0910 I have an 8 GB GPU (RTX 2070) and 16 GB of RAM. At launch with the '--lazy' argument, I have 12.3 GB of RAM and 7.5 GB of GPU RAM available. GPU RAM usage climbs to around 6500 MB (as reported by NVIDIA Inspector) and I then get:
There is minimal CPU RAM usage during this process. Automatic1111 with several extensions runs fine. Any suggestions as to why CPU RAM doesn't seem to be used?
@aleph23 Hi! This project has one major difference from Automatic1111: it launches MANY models at the same time, so it eats up far more resources. There is a workaround, though: cfcreator serve --limit 1, which means you'll only load one model at a time.
How do I run this, bro? :)
The Google Colab should be working now!
I have an 8 GB GPU... I suspect it's not enough, because I run into
I was able to run vanilla Stable Diffusion. Does this require additional GPU RAM?