
[enh] Add Support for multiple adapters on Transformers-based models #3046

Merged · 5 commits · Nov 8, 2024

Conversation

tomaarsen
Collaborator

@tomaarsen tomaarsen commented Nov 8, 2024

Supersedes #2993

Hello!

Details

Builds on top of #2993. I don't have permission to push directly into that PR, as it's based on https://github.com/nuclia/sentence-transformers/tree/master, to which I have no access. So I'm creating this PR to sidestep those issues.

cc @carlesonielfa

  • Tom Aarsen

@tomaarsen
Collaborator Author

tomaarsen commented Nov 8, 2024

@carlesonielfa
I've uploaded https://huggingface.co/sentence-transformers-testing/stsb-bert-tiny-lora (it includes a training script as well). The model card also shows that training with LoRA works!
I'm going to hustle to get this included in the next release, but that does mean that it won't have extensive documentation on launch. Ideally, I'd like to host a Training with PEFT documentation page on sbert.net, but that will have to come later.

P.S. If you're interested, you're free to help me get that set up! I'm not super familiar with PEFT.

cc @bloodbare for the PR

  • Tom Aarsen

@tomaarsen tomaarsen merged commit 7ede83b into UKPLab:master Nov 8, 2024
9 checks passed
@carlesonielfa
Contributor

carlesonielfa commented Nov 8, 2024

Great! Thank you so much for taking a look!

As for helping with the documentation, I'll consider it! However, I'm not sure I'll have the time.

@tomaarsen tomaarsen deleted the feat/peft_adapter_mixin branch November 8, 2024 15:36