chore(training): Allow training on torch xla > 2.3.0, add warning #48

Merged — 2 commits merged into main from training-on-newer-xla-version, Jun 4, 2024

Conversation

tengomucho
Collaborator

@tengomucho tengomucho commented Jun 4, 2024

When fine-tuning Gemma-7B on PyTorch XLA 2.3.0, we saw and reported an issue: pytorch/xla#7138
This seems to have been fixed on nightly. This commit relaxes the dependency version constraints and displays a warning when getting FSDP training args for Gemma on 2.3.0.
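The warning described above can be sketched as a simple version gate. This is a hypothetical illustration, not the PR's actual code: the function name `check_fsdp_xla_version` and its signature are assumptions; only the model name, the affected torch_xla version (2.3.0), and the referenced issue (pytorch/xla#7138) come from the PR description.

```python
# Hypothetical sketch of the version check this PR describes: warn when
# building FSDP training arguments for Gemma on torch_xla 2.3.0, where
# pytorch/xla#7138 caused fine-tuning failures (fixed on nightly).
import warnings


def check_fsdp_xla_version(model_type: str, xla_version: str) -> None:
    """Warn if the installed torch_xla version is known to break Gemma FSDP."""
    # Parse "major.minor.patch" into a comparable tuple (ignore any suffix).
    parsed = tuple(int(p) for p in xla_version.split("+")[0].split(".")[:3])
    if model_type == "gemma" and parsed == (2, 3, 0):
        warnings.warn(
            "Fine-tuning Gemma with FSDP on torch_xla 2.3.0 may fail "
            "(see pytorch/xla#7138); consider upgrading to a nightly build "
            "where the issue is fixed."
        )


# In real code, xla_version would come from torch_xla.__version__.
check_fsdp_xla_version("gemma", "2.3.0")  # emits the warning
```

The check only warns rather than raising, matching the PR's intent: training on 2.3.0 is still allowed, but the user is told why it may fail.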

@tengomucho tengomucho requested a review from mfuntowicz June 4, 2024 10:43
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@tengomucho tengomucho merged commit df7884a into main Jun 4, 2024
2 checks passed
@tengomucho tengomucho deleted the training-on-newer-xla-version branch June 4, 2024 13:57
3 participants