
Fix ShardedDataParallel has no attribute require_backward_grad_sync #6915

Merged
merged 7 commits into master from bugfix/manual_sharded on Apr 10, 2021

Conversation

awaelchli
Contributor

@awaelchli awaelchli commented Apr 9, 2021

What does this PR do?

Fixes #6804

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

Commits:

  • test
  • test
  • update test
  • fix spawn
  • update plugins
  • revert
  • wip
  • sharded
  • fix name
  • c
@awaelchli awaelchli added bug Something isn't working priority: 1 Medium priority task labels Apr 9, 2021
Contributor

@tchaton tchaton left a comment


Neat!

@awaelchli awaelchli marked this pull request as ready for review April 9, 2021 12:21
@kaushikb11 kaushikb11 enabled auto-merge (squash) April 9, 2021 12:25
@pep8speaks

pep8speaks commented Apr 9, 2021

Hello @awaelchli! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-04-10 14:58:35 UTC

@mergify mergify bot added the has conflicts label Apr 9, 2021
@mergify mergify bot removed the has conflicts label Apr 9, 2021
@awaelchli awaelchli added this to the 1.2.x milestone Apr 9, 2021
@@ -33,6 +36,7 @@ def configure_ddp(self):
self._model = ShardedDataParallel(
LightningShardedDataParallel(self.model), sharded_optimizer=self.lightning_module.trainer.optimizers
)
setattr(self._model, "require_backward_grad_sync", False)
Contributor

@carmocca carmocca Apr 9, 2021


nit: why setattr instead of self._model.require_backward_grad_sync = False?

Contributor Author


because the model here is ShardedDataParallel, and this attribute is only available on DistributedDataParallel.
We check for this attribute in manual optimization.
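The reply above can be illustrated with a small sketch. The names here (ShardedWrapperStub, toggle_grad_sync) are hypothetical stand-ins, not Lightning's or fairscale's real API; they only show why the attribute must be pre-set on the wrapper: unlike torch's DistributedDataParallel, ShardedDataParallel does not define require_backward_grad_sync, so code that toggles the flag during manual optimization would otherwise find nothing to toggle.

```python
# Hypothetical sketch (not the actual Lightning code) of the attribute
# check this PR makes work for ShardedDataParallel.

class ShardedWrapperStub:
    """Stand-in for ShardedDataParallel: defines no
    require_backward_grad_sync attribute, unlike DistributedDataParallel."""


def toggle_grad_sync(model, enabled):
    # Mirrors the kind of check done during manual optimization:
    # flip the flag only when the wrapper actually exposes it.
    if hasattr(model, "require_backward_grad_sync"):
        model.require_backward_grad_sync = enabled
        return True
    return False


model = ShardedWrapperStub()
print(toggle_grad_sync(model, True))   # False: attribute is absent, nothing toggled

# The fix applied in configure_ddp(): define the attribute up front.
setattr(model, "require_backward_grad_sync", False)
print(toggle_grad_sync(model, True))   # True: flag now toggles normally
```

Note that `setattr(obj, "name", value)` and `obj.name = value` are equivalent here, as carmocca's nit points out; the substantive part of the fix is that the attribute gets defined at all.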

@awaelchli
Contributor Author

unrelated (?) tests are failing now for an unknown reason :(
it was passing earlier this afternoon
it's very surprising to me

@awaelchli awaelchli added the ready PRs ready to be merged label Apr 10, 2021
@codecov

codecov bot commented Apr 10, 2021

Codecov Report

Merging #6915 (d054ed0) into master (20ff50c) will decrease coverage by 0%.
The diff coverage is 90%.

@@          Coverage Diff           @@
##           master   #6915   +/-   ##
======================================
- Coverage      92%     92%   -0%     
======================================
  Files         194     194           
  Lines       12336   12346   +10     
======================================
+ Hits        11322   11328    +6     
- Misses       1014    1018    +4     

@kaushikb11 kaushikb11 merged commit fe0d088 into master Apr 10, 2021
@kaushikb11 kaushikb11 deleted the bugfix/manual_sharded branch April 10, 2021 16:14
@SeanNaren SeanNaren mentioned this pull request Apr 12, 2021
SeanNaren pushed a commit that referenced this pull request Apr 13, 2021
…6915)

Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com>
(cherry picked from commit fe0d088)
lexierule pushed a commit that referenced this pull request Apr 14, 2021
…6915)

Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com>
(cherry picked from commit fe0d088)
Labels
bug Something isn't working priority: 1 Medium priority task ready PRs ready to be merged
Development

Successfully merging this pull request may close these issues.

'LightningShardedDataParallel' object has no attribute 'require_backward_grad_sync'
7 participants