
Example lora_config.yaml not up to date #1067

Closed · hschaeufler opened this issue Oct 23, 2024 · 3 comments


hschaeufler (Contributor) commented Oct 23, 2024

When the example config YAML was updated for the addition of full-parameter tuning, the lora_layers parameter was not renamed to num_layers.

However, if this config is used, both values later end up in adapter_config.json. I assume that only the num_layers value is valid and lora_layers is simply copied through? So I probably have to fine-tune my model again if I want to use the correct number of layers?

In addition to updating lora_config.yaml, maybe it would also make sense to add a check that only valid values are included?

    "lora_layers": 32,
    "lora_parameters": {
        "keys": [
            "self_attn.q_proj",
            "self_attn.v_proj",
            "self_attn.k_proj",
            "self_attn.o_proj",
            "mlp.gate_proj",
            "mlp.down_proj",
            "mlp.up_proj"
        ],
...
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
    "num_layers": 16,

Are there any other values I should update in the config? I have seen that use_dora has also been dropped; is that correct?

awni (Member) commented Oct 23, 2024

  • Yes lora_layers is ignored. It should be removed from the example config.
  • We could have a check.. but then again.. it might be useful to let the user include other stuff in the config (some metadata for their own experiments).
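A lenient check along those lines could warn about unrecognized top-level keys while still letting them through; a minimal Python sketch (the helper name and the known-key set are assumptions, not mlx-lm code):

    import warnings

    # Assumed subset of valid top-level keys; not the actual mlx-lm schema.
    KNOWN_KEYS = {"model", "num_layers", "lora_parameters", "train", "data"}

    def check_config(config: dict) -> None:
        # Warn about unknown keys instead of rejecting them, so users can
        # still stash their own experiment metadata in the config.
        for key in config:
            if key not in KNOWN_KEYS:
                warnings.warn(f"Unrecognized config key '{key}'; it will be ignored.")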

hschaeufler (Contributor, Author) commented Oct 23, 2024

> • Yes lora_layers is ignored. It should be removed from the example config.
> • We could have a check.. but then again.. it might be useful to let the user include other stuff in the config (some metadata for their own experiments).

Ok. Thank you. Should I create a PR for the example_config.yaml or do you want to do that?

That's a good point. I didn't realize the config was meant to allow extra metadata. I have always created a separate JSON file with additional metadata.

I only noticed by coincidence that I still had the old lora_layers parameter in my config. A deprecation warning would have helped me recognize earlier that the parameter names had also changed in the YAML. That's not meant as criticism, though. I'm grateful for your work on MLX LM and think you're doing a great job. :)

awni (Member) commented Oct 26, 2024

Thanks for the fix! For the future, a deprecation warning is the right call.. we'll be more careful there.
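As a rough illustration, such a deprecation warning might look like the sketch below (hypothetical helper; the key mapping is the one from this issue):

    import warnings

    # Map of deprecated config keys to their replacements (from this issue).
    DEPRECATED_KEYS = {"lora_layers": "num_layers"}

    def upgrade_config(config: dict) -> dict:
        # Rename deprecated keys in place, warning the user once per key.
        for old, new in DEPRECATED_KEYS.items():
            if old in config:
                warnings.warn(
                    f"Config key '{old}' is deprecated; use '{new}' instead.",
                    DeprecationWarning,
                )
                config.setdefault(new, config.pop(old))
        return config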
