
Commit

fix llama2_70b_lora broken link for Accelerate config file in the readme (#766)

Co-authored-by: Hiwot Kassa <hiwotkassa@fb.com>
hiwotadese and Hiwot Kassa authored Oct 17, 2024
1 parent cdd928d commit 1c1c619
Showing 1 changed file with 1 addition and 1 deletion.
llama2_70b_lora/README.md (2 changes: 1 addition & 1 deletion)
@@ -84,7 +84,7 @@ accelerate launch --config_file configs/default_config.yaml scripts/train.py \
--seed 1234 \
--lora_target_modules "qkv_proj,o_proj"
```
-where the Accelerate config file is [this one](https://github.com/regisss/lora/blob/main/configs/default_config.yaml).
+where the Accelerate config file is [this one](https://github.com/mlcommons/training/blob/master/llama2_70b_lora/configs/default_config.yaml).

> Using flash attention with `--use_flash_attn` is necessary for training on 8k-token sequences.
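For context, the README section touched by this diff launches training with `accelerate launch` using the config file whose link is corrected here. Below is a minimal sketch of such an invocation, assembled only from the flags visible in this hunk plus the `--use_flash_attn` flag mentioned in the note; any other flags present in the actual README are omitted.

```bash
# Sketch of the README's launch command; only flags visible in this diff are shown.
accelerate launch --config_file configs/default_config.yaml scripts/train.py \
    --use_flash_attn \
    --seed 1234 \
    --lora_target_modules "qkv_proj,o_proj"
```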
