
[Examples] use loralinear instead of deprecated lora attn procs. #5331

Merged: 8 commits into main from graduate-lora-training on Oct 11, 2023

Conversation

@sayakpaul (Member, Author)

@patrickvonplaten did you envision something like this to get rid of the LoRA attention processors from the training examples? If okay I will propagate the changes to the rest of the scripts.
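For context, a minimal sketch of the direction this PR takes, assuming a diffusers version of that era where the UNet's attention projections are `LoRACompatibleLinear` modules exposing `set_lora_layer` (the checkpoint ID and `rank` value are illustrative stand-ins for the training script's arguments): instead of installing LoRA attention processors, a `LoRALinearLayer` is attached directly to each projection matrix.

```python
# Sketch: attach LoRALinearLayer modules to the UNet's attention projections
# instead of swapping in (deprecated) LoRA attention processors.
from diffusers import UNet2DConditionModel
from diffusers.models.lora import LoRALinearLayer

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"  # example checkpoint
)
rank = 4  # stand-in for the training script's LoRA rank argument

unet_lora_parameters = []
for attn_processor_name in unet.attn_processors:
    # Walk from the UNet down to the attention module that owns this
    # processor, e.g. "mid_block.attentions.0.transformer_blocks.0.attn1".
    attn_module = unet
    for name in attn_processor_name.split(".")[:-1]:
        attn_module = getattr(attn_module, name)

    # Set the `lora_layer` attribute on each projection matrix.
    for proj in (attn_module.to_q, attn_module.to_k, attn_module.to_v, attn_module.to_out[0]):
        proj.set_lora_layer(
            LoRALinearLayer(
                in_features=proj.in_features,
                out_features=proj.out_features,
                rank=rank,
            )
        )
        # Only the LoRA parameters are trained.
        unet_lora_parameters.extend(proj.lora_layer.parameters())
```

Reading `in_features`/`out_features` off each projection removes the manual hidden-size and cross-attention-dim bookkeeping the processor-based setup required.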

@patrickvonplaten (Contributor) left a comment


Cool, that works for me!

@sayakpaul marked this pull request as ready for review on October 10, 2023 at 14:41
@sayakpaul (Member, Author)

@patrickvonplaten could you give this another look?

@patrickvonplaten (Contributor) left a comment


Works for me!

@HuggingFaceDocBuilderDev commented Oct 11, 2023

The documentation is not available anymore as the PR was closed or merged.

@sayakpaul merged commit 0fa32bd into main on Oct 11, 2023
13 checks passed
@sayakpaul deleted the graduate-lora-training branch on October 11, 2023 at 11:02
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request on Dec 25, 2023
[Examples] use loralinear instead of deprecated lora attn procs. (huggingface#5331)

* use loralinear instead of deprecated lora attn procs.
* fix parameters()
* fix saving
* add back support for add kv proj.
* fix: param accumulation.
* propagate the changes.
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
…uggingface#5331)

* use loralinear instead of depecrecated lora attn procs.

* fix parameters()

* fix saving

* add back support for add kv proj.

* fix: param accumul,ation.

* propagate the changes.
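Two of the commit messages above, "fix parameters()" and "fix saving", correspond to follow-on changes once the LoRA layers live on the linear projections. A hedged sketch of both, reusing `unet` and `unet_lora_parameters` from the earlier snippet; `collect_unet_lora_state_dict` is a hypothetical helper name, not necessarily what the examples use:

```python
import torch
from diffusers import StableDiffusionPipeline

# "fix parameters()": with the processors gone, the optimizer receives the
# accumulated LoRA parameters rather than AttnProcsLayers(...).parameters().
# `unet_lora_parameters` comes from the setup sketch above.
optimizer = torch.optim.AdamW(unet_lora_parameters, lr=1e-4)

# "fix saving": walk the UNet for attached `lora_layer` modules and build a
# flat state dict of just the LoRA weights.
def collect_unet_lora_state_dict(unet):  # hypothetical helper name
    lora_state_dict = {}
    for module_name, module in unet.named_modules():
        lora_layer = getattr(module, "lora_layer", None)
        if lora_layer is not None:
            for param_name, param in lora_layer.state_dict().items():
                lora_state_dict[f"{module_name}.lora.{param_name}"] = param
    return lora_state_dict

# The pipeline's LoRA-saving classmethod serializes this mapping to disk.
StableDiffusionPipeline.save_lora_weights(
    save_directory="lora-out",
    unet_lora_layers=collect_unet_lora_state_dict(unet),
)
```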
Successfully merging this pull request may close these issues.

LoRAAttnProcessor2_0 getting deprecated and consequences for Dreambooth training
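For readers landing here from that issue, a sketch of the processor-based pattern the Dreambooth examples are moving away from, again reusing `unet` from the first snippet; the manual hidden-size and cross-attention-dim bookkeeping below is exactly what the `LoRALinearLayer` approach removes:

```python
# Deprecated pattern: build one LoRAAttnProcessor per attention block, working
# out each block's hidden size and cross-attention dim by hand.
from diffusers.models.attention_processor import LoRAAttnProcessor

lora_attn_procs = {}
for name in unet.attn_processors.keys():
    # Self-attention (attn1) has no cross-attention conditioning.
    cross_attention_dim = (
        None if name.endswith("attn1.processor") else unet.config.cross_attention_dim
    )
    if name.startswith("mid_block"):
        hidden_size = unet.config.block_out_channels[-1]
    elif name.startswith("up_blocks"):
        block_id = int(name[len("up_blocks.")])
        hidden_size = list(reversed(unet.config.block_out_channels))[block_id]
    elif name.startswith("down_blocks"):
        block_id = int(name[len("down_blocks.")])
        hidden_size = unet.config.block_out_channels[block_id]
    lora_attn_procs[name] = LoRAAttnProcessor(
        hidden_size=hidden_size, cross_attention_dim=cross_attention_dim
    )

# This is the setup the linked issue reports as being deprecated.
unet.set_attn_processor(lora_attn_procs)
```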