Example lora_config.yaml not up to date #1067
Comments
Ok, thank you. Should I create a PR for the example_config.yaml, or do you want to do that? That's a good point. I didn't realize it was intended to include metadata; I always created an extra JSON file with additional metadata. I only noticed by coincidence that I still had the old `lora_layers` parameter in my config. Maybe a deprecation warning would have helped me recognize earlier that the parameters in the YAML had also changed. That's not meant as criticism: I'm grateful for your work on MLX LM and think you're doing a great job. :)
Thanks for the fix! For the future, a deprecation warning is the right call; we'll be more careful there.
When the config YAML was updated for the addition of full-parameter tuning, the `lora_layers` parameter was not adjusted to `num_layers`.

`mlx-examples/llms/mlx_lm/examples/lora_config.yaml`, line 17 at commit `9000e28`
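To illustrate the rename, a hypothetical excerpt of the updated config is sketched below. The key names and values here are assumptions for illustration; check the current `mlx_lm/examples/lora_config.yaml` for the authoritative set of options.

```yaml
# Deprecated key that the example config still carried:
# lora_layers: 16

# Renamed key after the full-parameter-tuning change:
num_layers: 16

# Assumption: the boolean use_dora flag was superseded by a mode
# selector when full tuning was added (e.g. lora / dora / full).
fine_tune_type: lora
```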
However, if the config is used, both values later appear in `adapter_config.json`. I assume that only the `num_layers` value is valid and `lora_layers` is simply copied over? I probably have to fine-tune my model again if I want to use the correct number of layers?
Maybe it also makes sense, in addition to updating the lora_config.yaml, to add a check that only valid values are included in the config?
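The suggested check could look something like the sketch below. This is a hypothetical validator, not mlx_lm's actual loader: the set of valid keys is an illustrative subset, and `check_config` is an invented helper name. The idea is to reject unknown keys outright and to give renamed keys (like `lora_layers`) a pointer to their replacement, which doubles as the deprecation warning discussed above.

```python
# Hypothetical config validation sketch; key names are illustrative.
VALID_KEYS = {"model", "train", "data", "num_layers", "batch_size", "iters"}
RENAMED = {"lora_layers": "num_layers"}  # old key -> new key

def check_config(config: dict) -> None:
    """Raise ValueError for renamed or unknown config keys."""
    for key in config:
        if key in RENAMED:
            raise ValueError(
                f"'{key}' was renamed to '{RENAMED[key]}'; "
                "please update your config."
            )
        if key not in VALID_KEYS:
            raise ValueError(f"Unknown config key: '{key}'")

# A config using only current keys passes silently:
check_config({"model": "some-model", "num_layers": 16})
```

Run against a config that still contains `lora_layers`, the check fails with a message naming the new key, instead of the stale value being silently copied into `adapter_config.json`.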
Are there any other values that I should update in the config? I have seen that `use_dora` has also been dropped?