added checks for layers to transforms and layer pattern in lora #2159
Conversation
Thanks for working on this PR so quickly. Could you please propagate this check to the other config classes that support this too? Also, it would be great to add a test for this. Apart from this fix, I think we can also improve the error reporting when users pass … (see `peft/src/peft/tuners/tuners_utils.py`, lines 512 to 515 at `57a452a`).
As the user reported, they got the error …, but the real reason was how they defined …
Sure, I will work through this one.
I've added a test case and extended the check to the other classes. I noticed that one unrelated test case was failing: … Should we try and fix this?
Ooh, I'll raise another PR to handle this second part.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Could you please run …?
Hi @BenjaminBossan, done!
@JINO-ROHIT Oh, I see there is an error there. It is actually possible to have … (see `peft/src/peft/tuners/tuners_utils.py`, lines 981 to 982 at `e8259ff`). This is why a bunch of tests are failing. I didn't know that (or forgot), so this is my bad. I think we can still do the reverse though, i.e. if …
Yeah, missed this too. Did you mean to do something like this?
Ah no, I don't think that's quite right. You added … What I mean is that the check

```python
if self.layers_to_transform is not None and self.layers_pattern is None:
    raise ValueError("When `layers_to_transform` is specified, `layers_pattern` must also be specified.")
```

should be changed to

```python
if self.layers_pattern is not None and self.layers_to_transform is None:
    raise ValueError("When `layers_pattern` is specified, `layers_to_transform` must also be specified.")
```

I also like to add some parentheses in these cases for readability.
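To illustrate the direction of the implication being discussed, here is a minimal sketch using a simplified stand-in config class (`DummyConfig` is hypothetical; the real config classes live in `src/peft/tuners/*/config.py` and carry many more fields):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DummyConfig:
    # Hypothetical stand-in for a PEFT tuner config, reduced to two fields.
    layers_to_transform: Optional[list] = None
    layers_pattern: Optional[list] = None

    def __post_init__(self):
        # Reversed check: layers_pattern only makes sense when
        # layers_to_transform is also given, not the other way around.
        if self.layers_pattern is not None and self.layers_to_transform is None:
            raise ValueError(
                "When `layers_pattern` is specified, `layers_to_transform` must also be specified."
            )


# layers_to_transform on its own is valid
ok = DummyConfig(layers_to_transform=[0, 1])

# layers_pattern without layers_to_transform should raise
try:
    DummyConfig(layers_pattern=["layers"])
except ValueError as e:
    print("raised:", e)
```

The point is that the implication runs one way only: `layers_to_transform` alone is a legal configuration, while `layers_pattern` alone is not.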
Yeah, now I get why the test cases weren't passing after seeing the other test config files. I've made the reverse check and moved the check down, so now I think all the test cases should pass.
As you can see, some tests in the CI are failing. This is actually expected, as some of them just don't make sense anymore, like this one (`peft/tests/test_tuners_utils.py`, line 97 at `0d58942`):

```python
("foo.bar.7.baz", ["baz"], None, ["bar"], True),
```
While checking this, I also noticed some room for improvement in the checking logic, please take a look at my comment. I think this needs to be changed and the tests adjusted a bit, like I mentioned. You can call `pytest tests/test_tuners_utils.py` locally to ensure that all relevant tests pass.
`src/peft/tuners/adalora/config.py` (outdated):

```python
if isinstance(self.target_modules, str) and self.layers_pattern is not None:
    raise ValueError("`layers_pattern` cannot be used when `target_modules` is a str.")
# check for layers_to_transform and layers_pattern
if (self.layers_pattern is not None) and (self.layers_to_transform is None):
```
I think these checks should be adjusted like so:

```diff
- if (self.layers_pattern is not None) and (self.layers_to_transform is None):
+ if self.layers_pattern and not self.layers_to_transform:
```

Why?

- If users pass `layers_pattern=[], layers_to_transform=None`, we should not raise.
- If users pass `layers_pattern=["foo"], layers_to_transform=[]`, we should raise.

Right now, we don't do that because of the strict `None` check.
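The difference between the strict `None` check and the truthiness check can be seen directly on the two edge cases above. A small sketch (the helper functions `check_strict` and `check_truthy` are hypothetical, written only to compare the two conditions, and return `True` when the check would raise):

```python
def check_strict(layers_pattern, layers_to_transform):
    # Strict None check: only looks at whether the arguments are None.
    return layers_pattern is not None and layers_to_transform is None


def check_truthy(layers_pattern, layers_to_transform):
    # Truthiness check: treats an empty list the same as "not specified".
    return bool(layers_pattern) and not layers_to_transform


# layers_pattern=[], layers_to_transform=None: should NOT raise,
# but the strict check does ([] is not None).
print(check_strict([], None), check_truthy([], None))

# layers_pattern=["foo"], layers_to_transform=[]: SHOULD raise,
# but the strict check misses it ([] is not None).
print(check_strict(["foo"], []), check_truthy(["foo"], []))
```

In both edge cases, only the truthiness version gives the intended answer.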
Sure, I'm away for a couple of days at a conference; I'll be back and fix this :)
Thanks @JINO-ROHIT. It's not urgent, so take your time, enjoy the conference.
Thank you very much @BenjaminBossan, I'm back. The checks now make a lot more sense; I've fixed them and run the tests.
Hi @BenjaminBossan, any clue why this is failing?
Thanks for adding this check to the configs.
This PR aims to address point 3 in #2155, adjusting the error message when `layers_to_transform` and `layers_pattern` are specified. Tbh, I don't quite understand the functionality of `layers_pattern`, but assuming that we need `layers_pattern` when `layers_to_transform` is applied, I've added this simple check.