[Examples] use loralinear instead of deprecated lora attn procs. #5331
Conversation
@patrickvonplaten did you envision something like this to get rid of the LoRA attention processors from the training examples? If it looks okay, I will propagate the changes to the rest of the scripts.
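For reference, a minimal sketch of the pattern the examples move to, assuming the `LoRALinearLayer`/`set_lora_layer` API available in diffusers at the time of this PR; the checkpoint name and rank are illustrative rather than taken from the scripts:

```python
from diffusers import UNet2DConditionModel
from diffusers.models.lora import LoRALinearLayer

# Illustrative checkpoint; the training scripts take this from --pretrained_model_name_or_path.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)
unet.requires_grad_(False)

unet_lora_parameters = []
for attn_processor_name in unet.attn_processors:
    # Resolve the attention module that owns this processor, e.g.
    # "down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor" -> attn1.
    attn_module = unet
    for name in attn_processor_name.split(".")[:-1]:
        attn_module = getattr(attn_module, name)

    # Attach a LoRA layer to each projection instead of swapping in a LoRAAttnProcessor.
    # (Models with added KV projections get the same treatment, per the "add kv proj" commit.)
    for proj in (attn_module.to_q, attn_module.to_k, attn_module.to_v, attn_module.to_out[0]):
        proj.set_lora_layer(
            LoRALinearLayer(in_features=proj.in_features, out_features=proj.out_features, rank=4)
        )
        # Accumulate only the LoRA parameters; these are what the optimizer trains.
        unet_lora_parameters.extend(proj.lora_layer.parameters())
```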
Cool, that works for me!
@patrickvonplaten could you give this another look?
Works for me!
The documentation is not available anymore as the PR was closed or merged.
…uggingface#5331)
* use loralinear instead of deprecated lora attn procs.
* fix parameters()
* fix saving
* add back support for add kv proj.
* fix: param accumulation.
* propagate the changes.
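Regarding the "fix parameters()" and "fix saving" commits above, a hedged sketch of how the collected LoRA weights can be serialized with the library's LoRA utilities; `unet_lora_state_dict` and the output directory below are illustrative stand-ins for the state dict the training scripts gather from the attached `lora_layer` modules:

```python
import torch
from diffusers.loaders import LoraLoaderMixin

# Stand-in for the flat state dict of LoRA tensors collected from the UNet's
# attached `lora_layer` modules (keys follow the module paths inside the UNet).
unet_lora_state_dict = {
    "down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora_layer.down.weight": torch.zeros(4, 320),
    "down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora_layer.up.weight": torch.zeros(320, 4),
}

# Serialize only the LoRA weights; the base UNet checkpoint is left untouched.
LoraLoaderMixin.save_lora_weights(
    save_directory="lora-weights",  # illustrative output directory
    unet_lora_layers=unet_lora_state_dict,
)
```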
Fixes: #5133
Colab: https://colab.research.google.com/gist/sayakpaul/4e00a485de92a68b2d5411f090493081/scratchpad.ipynb
WandB: https://wandb.ai/xin-sayak/dreambooth-lora/runs/drwoiw6j
Weights on the Hub: https://huggingface.co/sayakpaul/new-lora-check-v15