
[Fix] Fix SyncBN build in PyTorch 1.9 #1138

Merged: 3 commits merged into master on Jun 27, 2021
Conversation

@xvjiarui (Collaborator) commented Jun 24, 2021

Motivation

Fix #1137

Modification

Check whether the attribute _specify_ddp_gpu_num exists before calling it.
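
A minimal sketch of the guard (illustrative only; the exact call site in mmcv may differ):

import torch.nn as nn

layer = nn.SyncBatchNorm(64)

# PyTorch < 1.9 exposes the private hook SyncBatchNorm._specify_ddp_gpu_num;
# PyTorch 1.9 removed it, so only call it when the attribute exists.
if hasattr(layer, '_specify_ddp_gpu_num'):
    layer._specify_ddp_gpu_num(1)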

BC-breaking (Optional)

Does the modification introduce changes that break the backward-compatibility of the downstream repos?

No

@xvjiarui (Collaborator, Author)
We may need CI for PyTorch 1.9. @zhouzaida

@zhouzaida (Collaborator)

Not yet; we are still resolving compatibility with PyTorch 1.9.

codecov bot commented Jun 24, 2021

Codecov Report

Merging #1138 (6843309) into master (1b15f02) will increase coverage by 0.00%.
The diff coverage is 100.00%.

❗ Current head 6843309 differs from pull request most recent head d9af6b0. Consider uploading reports for the commit d9af6b0 to get more accurate results

@@           Coverage Diff           @@
##           master    #1138   +/-   ##
=======================================
  Coverage   67.96%   67.96%           
=======================================
  Files         159      159           
  Lines       10419    10416    -3     
  Branches     1897     1896    -1     
=======================================
- Hits         7081     7079    -2     
  Misses       2969     2969           
+ Partials      369      368    -1     
Flag        Coverage Δ
unittests   67.96% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                  Coverage Δ
mmcv/utils/parrots_wrapper.py   60.71% <ø> (-0.31%) ⬇️
mmcv/cnn/bricks/norm.py         100.00% <100.00%> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 1b15f02...d9af6b0.

@ZwwWayne (Collaborator)

It seems this PR does not fully fix #1137; see here:

class SyncBatchNorm(SyncBatchNorm_):

    def _specify_ddp_gpu_num(self, gpu_size):
        if TORCH_VERSION != 'parrots':
            super()._specify_ddp_gpu_num(gpu_size)
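
For reference, one possible way to guard this override as well (a sketch using the same names as the snippet above, not necessarily the actual follow-up fix):

class SyncBatchNorm(SyncBatchNorm_):

    def _specify_ddp_gpu_num(self, gpu_size):
        # On PyTorch >= 1.9 the parent class no longer defines this method,
        # so an unconditional super() call would raise AttributeError.
        if (TORCH_VERSION != 'parrots'
                and hasattr(SyncBatchNorm_, '_specify_ddp_gpu_num')):
            super()._specify_ddp_gpu_num(gpu_size)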

@zhouzaida (Collaborator)

We should check why PyTorch 1.9 removed the function; please refer to "Remove _specify_ddp_gpu_num method".

@xvjiarui (Collaborator, Author)

> We should check why PyTorch 1.9 removed the function; please refer to "Remove _specify_ddp_gpu_num method".

They removed it because it is no longer useful; the attribute _ddp_gpu_size was also removed.

Successfully merging this pull request may close these issues:
SyncBatchNorm breaks after PyTorch 1.9.0 (#1137)