When training a LoRA, VRAM usage depends heavily on the image resolution. For instance, at a resolution of 4096x4096, the VRAM requirement can reach approximately 60 GB on an A100.
For the cat example I provided, training took around 5,000 iterations. To save memory and time, you can first train the model at a lower base resolution (such as 1024x1024) and then fine-tune it at higher resolutions.
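The two-stage, progressive-resolution schedule described above could be sketched as follows. This is only an illustration: `train_lora` is a hypothetical stand-in for whatever training entry point you use, and the step counts for the second stage are assumptions, not values from this thread.

```python
# Progressive-resolution LoRA training schedule (illustrative sketch).
# Stage 1: train at a low base resolution (cheap on VRAM).
# Stage 2: fine-tune the same LoRA at the target resolution.
STAGES = [
    {"resolution": 1024, "steps": 5000},  # ~5,000 iterations, as in the cat example
    {"resolution": 4096, "steps": 1000},  # assumed fine-tuning budget
]

def run_schedule(stages, train_fn):
    """Run each stage, feeding the previous checkpoint into the next one."""
    checkpoint = None
    for stage in stages:
        checkpoint = train_fn(
            resolution=stage["resolution"],
            steps=stage["steps"],
            resume_from=checkpoint,
        )
    return checkpoint

# Example with a dummy trainer that only records what it was asked to do:
log = []

def dummy_train(resolution, steps, resume_from):
    log.append((resolution, steps, resume_from))
    return f"ckpt_{resolution}"

final = run_schedule(STAGES, dummy_train)
```

The key point is that the second stage resumes from the first stage's checkpoint, so the expensive high-resolution pass only has to refine an already-trained LoRA rather than learn from scratch.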
Hi, thanks for the excellent solution and resolution!
Please give more information about training the LoRA: