
Fixed multirank + multialpha for sequential LoRAs, added correct support of LoRA-C3Lier conversion #937

Conversation

kovalexal
Contributor

@pacman100 Hi!

As discussed in #873, here are the modifications I've done to fix the issue I had mentioned.

Also, I would like to add proper support for a special type of LoRA for Stable Diffusion, LoRA-C3Lier, which is just a LoRA with a different rank and alpha for conv2d layers, so I refactored the conversion script.
The new conversion script works better: it automatically detects whether the provided LoRA contains conv2d (1x1, 3x3) layers and provides custom ranks / alphas for these layers if needed.
This script can easily be modified for proper support of SDXL LoRAs, so I was going to add that in a separate PR if you are comfortable with it.
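For illustration, the detection step described above can be sketched roughly as follows. This is a minimal sketch, not the actual conversion script: it assumes kohya-style key naming (down-projection keys ending in `lora_down.weight`) and, for brevity, maps keys to shape tuples rather than real tensors; `detect_conv_lora` is a hypothetical helper.

```python
# Minimal sketch of conv2d detection in a kohya-style LoRA state dict.
# Assumptions: down-projection keys end with "lora_down.weight"; conv2d
# layers have 4D weights (rank, in_channels, kh, kw), linear layers 2D.
# For brevity, we map keys to shape tuples instead of real tensors.

def detect_conv_lora(shapes):
    """Return (has_conv, conv_ranks): whether any conv2d LoRA layers
    exist, and a map from module name to that layer's rank."""
    has_conv = False
    conv_ranks = {}
    suffix = ".lora_down.weight"
    for key, shape in shapes.items():
        if not key.endswith(suffix):
            continue
        if len(shape) == 4:  # conv2d down-projection
            has_conv = True
            conv_ranks[key[: -len(suffix)]] = shape[0]  # first dim = rank
    return has_conv, conv_ranks

# A rank-4 3x3 conv LoRA alongside a rank-8 linear LoRA:
shapes = {
    "lora_unet_conv.lora_down.weight": (4, 320, 3, 3),
    "lora_te_linear.lora_down.weight": (8, 768),
}
has_conv, ranks = detect_conv_lora(shapes)
```

Once the conv layers and their ranks are known, the script can emit per-layer rank / alpha overrides only when they differ from the defaults.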

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@BenjaminBossan
Member

Thanks @kovalexal. Could you please run make style?

@kovalexal
Contributor Author

@BenjaminBossan Hi!

Sorry for that, fixed styling.

Contributor

@pacman100 pacman100 left a comment


Thank you @kovalexal for all the work on supporting multirank and multialpha LoRA layers. Left a comment.

@kovalexal
Contributor Author

@pacman100 it seems strange that this test is failing on Windows. Which diffusers version is used for running the tests?
As far as I can see, diffusers changed the signature of LoRACompatibleConv: earlier its forward accepted only a single argument, but now they've added a second argument, scale. The test error is probably related to that.
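This kind of breakage can be guarded against by inspecting the callee's signature before passing the extra argument. A minimal sketch under that assumption; `maybe_scale_call` is a hypothetical helper, not PEFT or diffusers API:

```python
# Hedged sketch: call a forward that may or may not accept a `scale`
# keyword (as happened when diffusers added `scale` to
# LoRACompatibleConv.forward). Hypothetical helper for illustration.
import inspect

def maybe_scale_call(forward, x, scale=1.0):
    """Pass `scale` only if the callee's signature accepts it."""
    if "scale" in inspect.signature(forward).parameters:
        return forward(x, scale=scale)
    return forward(x)

# Old-style forward (single argument):
def old_forward(x):
    return x * 2

# New-style forward (accepts scale):
def new_forward(x, scale=1.0):
    return x * 2 * scale

out_old = maybe_scale_call(old_forward, 3)       # works with old signature
out_new = maybe_scale_call(new_forward, 3, 0.5)  # passes scale through
```

Pinning the dependency version, as done below, is the simpler short-term fix; signature probing like this is one way to stay compatible with both versions.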

@BenjaminBossan
Member

BenjaminBossan commented Sep 18, 2023

That could be the explanation. We're working on a proper fix; in the meantime we have pinned the diffusers version in #936. So this branch probably needs to merge main for the test to pass.

Edit: By the way, the failing test is probably unrelated to Windows; it just happened that Windows was the first CI run to fail, which cancelled all the other runs.

@kovalexal
Contributor Author

@BenjaminBossan ok, got it.

Should I merge main into this branch, or will we do it in the parent PR #873?

@BenjaminBossan
Member

If @pacman100 is fine with the PR as is, I'd suggest we merge it and do a full, deep review of his PR once it's finished.

@pacman100
Contributor

Hello, Thank you @kovalexal 🤗! Merging it.

@pacman100 pacman100 merged commit 2433abb into huggingface:smangrul/lora-supoort-multirank-multialpha Sep 18, 2023
pacman100 added a commit that referenced this pull request Sep 22, 2023
* support multiple ranks and alphas

* Update lora.py

* Update lora.py

* commit suggestions

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

* address comments

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

* Fixed multirank + multialpha for sequential LoRAs, added correct support of LoRA-C3Lier conversion (#937)

* Fixed multirank multialpha for sequential loras, added tests, fixed docs

* Refactored kohya_ss conversion script for proper support of LoRA-C3Lier

* Fixed styling

* Removed old comment from docstring

* shift `scale_layer`/`unscale_layer` to `LoraLayer` class to support all the child classes

* support multiple active adapters

* add `active_adapters` property

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

* fix bug related to active adapter of `ModulesToSaveWrapper`

* revert the change wrt active_adapter assignment

Co-Authored-By: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>

* addressing comments

* address comments

* address comment

---------

Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
Co-authored-by: Alexander Kovalchuk <kovalexal@gmail.com>
Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>