
Commit
actual param name for lr
kallewoof committed Jul 17, 2024
1 parent: b3d35bd · commit: ed6095a
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions src/peft/optimizers/loraplus.py
@@ -43,8 +43,8 @@ def create_loraplus_optimizer(
         model (`torch.nn.Module`): The model to be optimized.
         optimizer_cls (`torch.optim.Optimizer`): The optimizer class to be used.
         loraplus_lr_ratio (`float`):
-            The ratio of learning ηB/ηA where ηA is passed in as the optimizer learning rate. Should be ≥1. Should be
-            set in tandem with the optimizer learning rate (ηA); should be larger when the task is more difficult and
+            The ratio of learning ηB/ηA where ηA (lr) is passed in as the optimizer learning rate. Should be ≥1. Should be
+            set in tandem with the optimizer learning rate (lr); should be larger when the task is more difficult and
             the model needs to update its features to learn well. In this case, it helps to make the learning rate
             slightly smaller (e.g., by a factor of 2) than typical vanilla LoRA learning rates
         loraplus_lr_embedding (optional `float`):
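For context, a minimal sketch of how the documented parameters relate: `lr` is the keyword this docstring now refers to, and `loraplus_lr_ratio` scales it for the LoRA B matrices. The optimizer class, learning-rate values, and the `model` variable below are illustrative assumptions, not part of this commit:

    import torch
    from peft.optimizers import create_loraplus_optimizer

    # `lr` is the optimizer learning rate (ηA); the LoRA B matrices are
    # trained at lr * loraplus_lr_ratio (ηB). Values here are illustrative.
    optimizer = create_loraplus_optimizer(
        model=model,                      # assumed: a model with LoRA adapters applied
        optimizer_cls=torch.optim.AdamW,  # any torch.optim.Optimizer subclass
        lr=5e-5,                          # ηA, passed in as the optimizer learning rate
        loraplus_lr_ratio=16,             # ηB/ηA; should be >= 1
    )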
