I am fine-tuning the Bloom-560m model on a single GPU (A10) and hit an error. How can I solve it? I found similar problems in other projects, but I don't know how to solve them in Alpaca: Alpha-VLLM/LLaMA2-Accessory#76
The error occurs when saving the model checkpoint, which appears to be an FSDP problem. My Python version is 3.8.16.
Can I solve this by deleting `--fsdp "full_shard auto_wrap" --fsdp_transformer_layer_cls_to_wrap 'BloomBlock'` from the bash script?
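FSDP (Fully Sharded Data Parallel) shards model state across multiple GPUs, so on a single GPU it adds no benefit and can fail when gathering the sharded state dict at checkpoint time. Removing those two flags should let the Trainer save checkpoints normally. A minimal sketch of the modified launch command, assuming the stock Alpaca `train.py`; the paths and hyperparameter values shown are placeholders, not the original script's values:

```shell
# Single-GPU fine-tuning launch: the --fsdp and
# --fsdp_transformer_layer_cls_to_wrap flags are simply omitted.
torchrun --nproc_per_node=1 train.py \
    --model_name_or_path bigscience/bloom-560m \
    --data_path ./alpaca_data.json \
    --output_dir ./output \
    --num_train_epochs 3 \
    --per_device_train_batch_size 4 \
    --bf16 True
```

All other arguments from your original bash script can stay as they are; only the FSDP-related flags need to go.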