
Commit

Add a pretrained guard for the initial meta init on global rank 0 (#1397)

irenedea authored Jul 26, 2024
1 parent e882658 commit bb385f6
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion llmfoundry/models/hf/hf_causal_lm.py
@@ -284,7 +284,7 @@ def _autoset_attn_implementation_monkeypatch(
     # the different processes. To avoid this contention, we first create the model (on meta device) on local rank
     # zero. This will set up the transformers model cache and avoid the future contention.
     if dist.get_local_rank() == 0:
-        if os.path.isdir(pretrained_model_name_or_path):
+        if pretrained and os.path.isdir(pretrained_model_name_or_path):
             with init_empty_weights(include_buffers=False):
                 with warnings.catch_warnings():
                     warnings.simplefilter('ignore', UserWarning)
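
For context, below is a minimal, hypothetical sketch of the rank-0 pre-initialization pattern this one-line change guards. It is not the llm-foundry implementation: the helper name warm_model_cache, its signature, and the use of the LOCAL_RANK environment variable (instead of the project's own dist.get_local_rank()) are illustrative assumptions; only the guarded condition mirrors the diff.

import os
import warnings

import torch.distributed as dist
from accelerate import init_empty_weights
from transformers import AutoModelForCausalLM


def warm_model_cache(pretrained_model_name_or_path: str, pretrained: bool) -> None:
    """Hypothetical helper: build the model on the meta device on local rank 0 only.

    The goal is to populate the transformers model cache once, so the other local
    ranks do not all contend for it later.
    """
    local_rank = int(os.environ.get('LOCAL_RANK', '0'))
    if local_rank == 0:
        # The commit's guard: only do the meta-device init when pretrained weights
        # are actually requested and the path is a local checkpoint directory.
        if pretrained and os.path.isdir(pretrained_model_name_or_path):
            with init_empty_weights(include_buffers=False):
                with warnings.catch_warnings():
                    warnings.simplefilter('ignore', UserWarning)
                    # Instantiating once here warms the cache without allocating
                    # real weight tensors (they stay on the meta device).
                    AutoModelForCausalLM.from_pretrained(
                        pretrained_model_name_or_path,
                    )
    if dist.is_available() and dist.is_initialized():
        dist.barrier()  # other ranks wait until rank 0 has finished

The added "pretrained and" check means a run that is not loading pretrained weights (e.g. training from scratch) skips this meta init entirely, even if the given path happens to be a directory.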
