So, is it possible to train a Flux LoRA with a 4070 Ti Super 16 GB using this tool? #931
Replies: 1 comment 1 reply
-
I'm mostly curious why you would want to, when a 24 GB GPU is $0.15/hr on Vast or $0.22/hr on RunPod community cloud. A 5-hour training run will work better there and cost roughly $0.75 to $1.10, right? What's the gain of running a low batch size on 16 GB and training a lower-quality model?
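For reference, a quick back-of-the-envelope check of those rental figures, assuming the quoted rates ($0.15/hr on Vast, $0.22/hr on RunPod community cloud) and a roughly 5-hour run:

```python
# Rough cost check for the GPU rental rates quoted above (assumed figures,
# not current prices): 24 GB GPU at $0.15/hr (Vast) or $0.22/hr (RunPod
# community cloud), for a ~5 hour Flux LoRA training run.
hours = 5
rates_per_hour = {"Vast": 0.15, "RunPod community cloud": 0.22}

for provider, rate in rates_per_hour.items():
    cost = hours * rate
    print(f"{provider}: {hours} h x ${rate:.2f}/h = ${cost:.2f}")

# Output:
# Vast: 5 h x $0.15/h = $0.75
# RunPod community cloud: 5 h x $0.22/h = $1.10
```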
1 reply
-
Or is this only possible with kohya_ss? Thanks.