With tied embeddings adapter merged to tied layers #2018
Comments
Thanks for opening this issue. Yes, I agree that this is an easy source of errors, and having a warning would help. The main reason why this is not implemented yet is that merging is a layer-level operation in PEFT. An individual layer, however, cannot know whether its weights are tied or not. Therefore, we cannot easily check for this at the layer level. It could be possible to refactor this to work differently, but I don't see an easy way. We could still try to make an educated guess based on
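For illustration, one model-level heuristic for such an educated guess could look like the sketch below. This is only an illustration of the idea, not PEFT's actual implementation, and the helper name output_head_is_tied is hypothetical:

```python
# Sketch of a model-level tied-weights check (hypothetical, not PEFT's code).
from transformers import PreTrainedModel


def output_head_is_tied(model: PreTrainedModel) -> bool:
    """Guess whether the output head shares its weight with the input embedding."""
    # Guess 1: most transformers models expose this config flag.
    if getattr(model.config, "tie_word_embeddings", False):
        return True
    # Guess 2: compare the underlying storage of the two embedding matrices.
    in_emb = model.get_input_embeddings()
    out_emb = model.get_output_embeddings()
    if in_emb is None or out_emb is None:
        return False
    return in_emb.weight.data_ptr() == out_emb.weight.data_ptr()
```

Such a check would have to live above the layer level (e.g. on the model wrapper), which is exactly the refactoring question raised above.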
This can also be achieved by passing
Yes sure, happy to have a go at it later this week!
Fantastic, thanks. Don't hesitate to ask me if something is unclear, or to create a draft PR for early feedback.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Resolved via #2025.
System Info
peft=0.12.0
transformers=4.44.0
Who can help?
No response
Information
Tasks
Reproduction
With Gemma2, a model where tie_word_embeddings = True, using target_modules=["lm_head"] and then merging the adapter results in the adapter also being merged into the tied embedding layer, which is incorrect.
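A minimal reproduction sketch under these assumptions (the exact Gemma2 checkpoint id and the training step are illustrative, not taken from the original report):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Gemma2 ties lm_head to the input embedding (tie_word_embeddings=True).
base = AutoModelForCausalLM.from_pretrained("google/gemma-2-2b")  # illustrative checkpoint
peft_model = get_peft_model(base, LoraConfig(target_modules=["lm_head"]))

# ... train the adapter ...

merged = peft_model.merge_and_unload()

# lm_head and the input embedding share the same storage, so merging the
# adapter into lm_head silently modifies the embedding weights as well.
tied = (
    merged.get_input_embeddings().weight.data_ptr()
    == merged.get_output_embeddings().weight.data_ptr()
)
print("embedding and lm_head still share storage after merge:", tied)
```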
Expected behavior
I think that merging should not succeed silently, but either a:
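For illustration only, a hedged sketch of how the merge could fail loudly instead of silently; this is hypothetical and not the actual fix from #2025 (output_head_is_tied is the helper sketched above, and base_model / peft_config are assumed names):

```python
import warnings

# Hypothetical guard at merge time, not PEFT's actual behavior.
if output_head_is_tied(base_model) and "lm_head" in (peft_config.target_modules or []):
    warnings.warn(
        "lm_head shares its weight with the input embedding; merging the adapter "
        "will also modify the embedding weights. Untie the weights or drop "
        "lm_head from target_modules before merging."
    )
```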
Related issues