
feat(llama.cpp): support lora with scale and yarn #1277

Merged: mudler merged 2 commits into master from lora_llama-cpp on Nov 11, 2023

Conversation

mudler (Owner) commented on Nov 11, 2023

Description

This PR is part of #1255 and closes #1271

It adds support for LoRA adapters (with a scale factor) and YaRN RoPE scaling, configurable in the model config:

lora_base: ""
lora_adapter: ""
lora_scale: 0.4
rope_scaling: "yarn" # or "none"; defaults to "linear" when not set
rope_freq_base: 0.0
rope_freq_scale: 0.0
yarn_ext_factor: 0.0
yarn_attn_factor: 0.0
yarn_beta_fast: 0.0
yarn_beta_slow: 0.0
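
For illustration, below is a minimal sketch of how these options could sit inside a LocalAI model definition. The surrounding fields (name, backend, context_size, parameters) and the file names are hypothetical, shown only for context, and are not part of this PR:

```yaml
# Hypothetical model config (e.g. models/llama2-yarn.yaml). Everything except
# the lora_*, rope_* and yarn_* keys is an assumed, illustrative layout.
name: llama2-yarn
backend: llama
context_size: 8192
parameters:
  model: llama-2-7b.Q4_K_M.gguf

# Apply a LoRA adapter on top of the base model, with the given scale
lora_base: ""
lora_adapter: "my-adapter.bin"
lora_scale: 0.4

# RoPE / YaRN scaling options (values as in the defaults listed above)
rope_scaling: "yarn"
rope_freq_base: 0.0
rope_freq_scale: 0.0
yarn_ext_factor: 0.0
yarn_attn_factor: 0.0
yarn_beta_fast: 0.0
yarn_beta_slow: 0.0
```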

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
mudler linked an issue on Nov 11, 2023 that may be closed by this pull request
mudler changed the title from "feat(llama.cpp): support lora with scale" to "feat(llama.cpp): support lora with scale and yarn" on Nov 11, 2023
mudler merged commit 803a0ac into master on Nov 11, 2023
14 checks passed
mudler deleted the lora_llama-cpp branch on November 11, 2023 at 17:40
Development

Successfully merging this pull request may close these issues.

feat: support yarn