
Commit

missing LoraConfig import
kallewoof committed Jul 23, 2024
1 parent 91150a5 commit d2b3f9e
Showing 1 changed file with 1 addition and 1 deletion.
docs/source/developer_guides/lora.md (2 changes: 1 addition & 1 deletion)
````diff
@@ -174,7 +174,7 @@ LoRA training can optionally include special purpose optimizers. Currently the o
 LoRA training can be optimized using [LoRA+](https://arxiv.org/abs/2402.12354), which uses different learning rates for the adapter matrices A and B, shown to increase finetuning speed by up to 2x and performance by 1-2%.
 
 ```py
-from peft import get_peft_model
+from peft import LoraConfig, get_peft_model
 from peft.optimizers import create_loraplus_optimizer
 from transformers import Trainer
 import bitsandbytes as bnb
````
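For context, a minimal sketch of how the corrected import fits into the surrounding LoRA+ example. Only the four import lines come from the diff above; the model name, `LoraConfig` arguments, and the `lr`/`loraplus_lr_ratio` values are illustrative assumptions, not the file's actual contents:

```py
import bitsandbytes as bnb
from transformers import AutoModelForCausalLM, Trainer
from peft import LoraConfig, get_peft_model
from peft.optimizers import create_loraplus_optimizer

# LoraConfig is the class whose missing import this commit fixes.
# Model name and config values below are assumptions for illustration.
base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
config = LoraConfig(task_type="CAUSAL_LM", r=16, lora_alpha=32)
model = get_peft_model(base_model, config)

# LoRA+ uses one learning rate for the adapter A matrices and a larger
# one, scaled by loraplus_lr_ratio, for the B matrices.
optimizer = create_loraplus_optimizer(
    model=model,
    optimizer_cls=bnb.optim.Adam8bit,
    lr=5e-5,
    loraplus_lr_ratio=16,
)
scheduler = None

# Hand the custom optimizer to Trainer rather than letting it build its own.
trainer = Trainer(
    model=model,
    optimizers=(optimizer, scheduler),
)
```

Passing `optimizers=(optimizer, scheduler)` stops `Trainer` from constructing its own optimizer, so the separate A/B learning rates set up by `create_loraplus_optimizer` are preserved during training.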
