
[FSDP][optim_state_dict] Make FSDP optim_state_dict aware of DDP prefix #96415

Closed
wants to merge 1 commit

Conversation

@fegin (Contributor) commented Mar 9, 2023

Summary: When FSDP is wrapped inside DDP, the optimizer state_dict may break because DDP prepends its `module.` prefix to parameter names. This PR makes FSDP's optim_state_dict aware of that prefix.

Test Plan: CI

Differential Revision: D43893609
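
For background, a minimal sketch of the mismatch (none of this is the PR's code, and `strip_ddp_prefix` is a hypothetical helper): DDP stores the wrapped module as `self.module`, so the fully qualified names (FQNs) it reports gain a leading `module.` prefix, while FSDP's optim_state_dict matches optimizer state against unprefixed FQNs.

```python
# Minimal sketch of the FQN mismatch (not the PR's actual code).
# DDP stores the wrapped module as `self.module`, so every parameter
# name it exposes gains a leading "module." prefix.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 4)
ddp_model = DDP(model)

print([n for n, _ in model.named_parameters()])      # ['weight', 'bias']
print([n for n, _ in ddp_model.named_parameters()])  # ['module.weight', 'module.bias']

# FSDP's optim_state_dict keys optimizer state by FQN, so the extra
# "module." breaks the lookup when FSDP sits inside DDP. A hypothetical
# helper illustrating the idea (the real handling lives inside FSDP's
# optimizer state-dict internals):
_DDP_PREFIX = "module."

def strip_ddp_prefix(fqn: str) -> str:
    # Drop a single leading DDP "module." from an FQN, if present.
    return fqn[len(_DDP_PREFIX):] if fqn.startswith(_DDP_PREFIX) else fqn

assert strip_ddp_prefix("module.net.weight") == "net.weight"

dist.destroy_process_group()
```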

@pytorch-bot bot commented Mar 9, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96415

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 0a2ac3e:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot (Contributor) commented
This pull request was exported from Phabricator. Differential Revision: D43893609

@fegin fegin added the ciflow/trunk Trigger trunk jobs on your pull request label Mar 9, 2023

fegin added a commit to fegin/pytorch that referenced this pull request Mar 9, 2023
[FSDP][optim_state_dict] Make FSDP optim_state_dict aware of DDP prefix (pytorch#96415)

Summary:
Pull Request resolved: pytorch#96415

When FSDP is wrapped inside DDP, the optimizer state_dict may break because of the DDP prefix. This PR fixes the issue.

Test Plan: CI

Reviewed By: zhaojuanmao

Differential Revision: D43893609

fbshipit-source-id: 32039229f56c052797b558e0266a5798332d792e
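
The scenario the commit targets, as a minimal sketch: FSDP nested inside a DDP wrapper. The sketch assumes an already-initialized process group (e.g. launched via torchrun) and a device setup on which both wrappers are supported; `FSDP.optim_state_dict` is the public API the fix applies to.

```python
# Sketch of the scenario this commit fixes: FSDP wrapped inside DDP.
# Assumes a process group is already initialized (e.g. via torchrun).
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.nn.parallel import DistributedDataParallel as DDP

model = DDP(FSDP(torch.nn.Linear(8, 8)))  # DDP's "module." prefix now sits above FSDP
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

model(torch.randn(2, 8)).sum().backward()
optim.step()

# Before this change, DDP's "module." prefix could break the FQN lookup
# in this call; with it, the prefix is recognized and handled.
osd = FSDP.optim_state_dict(model, optim)
```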

@facebook-github-bot (Contributor) commented

@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@pytorchmergebot (Collaborator) commented

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status.

cyyever pushed commits to cyyever/pytorch_private that referenced this pull request on Mar 23 and Mar 27, 2023
[FSDP][optim_state_dict] Make FSDP optim_state_dict aware of DDP prefix (#96415)

Summary: When FSDP is wrapped inside DDP, the optimizer state_dict may break because of the DDP prefix. This PR fixes the issue.

Test Plan: CI

Differential Revision: D43893609

Pull Request resolved: pytorch/pytorch#96415
Approved by: https://github.com/zhaojuanmao
Labels
ciflow/trunk (Trigger trunk jobs on your pull request) · fb-exported · Merged · release notes: distributed (fsdp)
4 participants