[FSDP][optim_state_dict] Make FSDP optim_state_dict aware of DDP prefix (#96415)

Summary: When FSDP is wrapped inside DDP, the optimizer state_dict may be broken by DDP's `module.` prefix on parameter names. This PR fixes the issue.
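To illustrate the problem: DDP registers the wrapped model under an attribute named `module`, so every fully qualified parameter name gains a `module.` prefix, which no longer matches the names FSDP uses for its optimizer state_dict mapping. The helper below is a minimal, hypothetical sketch of the prefix handling (not PyTorch's actual API):

```python
# Hypothetical sketch: DDP(FSDP(model)) prefixes parameter FQNs with
# "module.", while FSDP's optimizer state_dict expects the unprefixed
# names. Stripping the DDP prefix restores the match.

DDP_PREFIX = "module."

def strip_ddp_prefix(fqn: str) -> str:
    """Remove a leading DDP "module." prefix from a fully qualified name."""
    if fqn.startswith(DDP_PREFIX):
        return fqn[len(DDP_PREFIX):]
    return fqn

# A parameter name as seen through DDP vs. the name FSDP expects:
assert strip_ddp_prefix("module.layer1.weight") == "layer1.weight"
# Names without the prefix are left untouched:
assert strip_ddp_prefix("layer1.weight") == "layer1.weight"
```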

Test Plan: CI

Differential Revision: D43893609

Pull Request resolved: pytorch/pytorch#96415
Approved by: https://github.com/zhaojuanmao
fegin authored and cyyever committed Mar 27, 2023
1 parent 78203a5 commit 921de41
Showing 1 changed file with 9 additions and 0 deletions.
9 changes: 9 additions & 0 deletions torch/distributed/fsdp/_common_utils.py
@@ -299,6 +299,15 @@ def f(module: torch.nn.Module, prefix: str, *args, **kwargs):
f"submodule_name = {submodule_name}"
)
new_prefix = prefix
elif submodule_name == "module":
    warnings.warn(
        "An unexpected prefix is detected. This case "
        "should only happen when DDP wraps the outer "
        "modules while FSDP wraps the inner ones. "
        f"prefix = {prefix}, "
        f"submodule_name = {submodule_name}"
    )
    new_prefix = prefix
f(submodule, new_prefix, *args, **kwargs)

f(root_module, "", *args, **kwargs)
