
updated sync bn #2838

Merged
merged 13 commits into master from sync_batchnorm on Aug 5, 2020

Conversation

ananyahjha93
Contributor

What does this PR do?

Fixes # (issue)

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from a team August 5, 2020 18:40
@mergify mergify bot requested a review from a team August 5, 2020 18:57
@nateraw nateraw self-requested a review August 5, 2020 19:10
Contributor

@nateraw nateraw left a comment


Update trainer args so Drone passes

@mergify mergify bot requested a review from a team August 5, 2020 19:11
@ananyahjha93
Contributor Author

@williamFalcon not sure why the Drone test is failing; the same test passes locally and was passing in the previous commit

@williamFalcon
Contributor

we don't need that anymore since you added the test using ddp_spawn, no?
we don't need the example anymore

@ananyahjha93
Contributor Author

ananyahjha93 commented Aug 5, 2020

ddp_spawn test is called using the example, as a script test
@williamFalcon ^^^

@Borda
Member

Borda commented Aug 5, 2020

> we don't need that anymore since you added the test using ddp_spawn, no?
> we don't need the example anymore

the test is the example; maybe we should put it into the tests, since other people crossing over would raise the same question...
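
For anyone landing on this thread with the same question, here is a minimal sketch of what the example/script test exercises. This is an illustration, not the PR's code: `MyModel` is a hypothetical LightningModule, and the flag names assume the 0.9-era Trainer API.

```python
# Minimal sketch, assuming the 0.9-era Trainer API. `sync_batchnorm=True`
# converts the model's BatchNorm layers to torch.nn.SyncBatchNorm so that
# batch statistics are synchronized across DDP processes.
import pytorch_lightning as pl

model = MyModel()  # hypothetical LightningModule containing BatchNorm layers

trainer = pl.Trainer(
    gpus=2,
    distributed_backend="ddp_spawn",  # the backend the script test exercises
    sync_batchnorm=True,              # sync BN statistics across GPUs
)
trainer.fit(model)
```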

@pep8speaks

pep8speaks commented Aug 5, 2020

Hello @ananyahjha93! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-08-05 22:55:59 UTC

@codecov

codecov bot commented Aug 5, 2020

Codecov Report

Merging #2838 into master will increase coverage by 4%.
The diff coverage is 50%.

@@           Coverage Diff           @@
##           master   #2838    +/-   ##
=======================================
+ Coverage      86%     90%    +4%     
=======================================
  Files          78      78            
  Lines        7029    7029            
=======================================
+ Hits         6049    6338   +289     
+ Misses        980     691   -289     

@Borda Borda added the bug (Something isn't working) and feature (Is an improvement or enhancement) labels Aug 5, 2020
@nateraw nateraw self-requested a review August 5, 2020 22:56
@Borda Borda added the ready (PRs ready to be merged) and priority: 0 (High priority task) labels Aug 5, 2020
@Borda Borda merged commit a5f2b89 into master Aug 5, 2020
@Borda Borda deleted the sync_batchnorm branch August 5, 2020 23:12
Contributor

@awaelchli awaelchli left a comment


nice addition @ananyahjha93 !


def prepare_data(self):
    # download only
    MNIST(self.data_dir, train=True, download=True, normalize=(0.1307, 0.3081))
Contributor


Just to double check, do we need TrialMNIST or MNIST?

Contributor Author


I did not want to change the TrialMNIST structure for this PR, as I had to include a DistributedSampler with shuffling
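
For context, a rough sketch of the dataloader pattern being referred to (illustrative only, not the exact code from this PR): a plain MNIST dataset wrapped in a DistributedSampler with shuffling enabled.

```python
# Rough sketch of the pattern described above; names are illustrative.
# DistributedSampler partitions the dataset across DDP processes;
# shuffle=True reshuffles the partition each epoch (the trainer, or the
# user, should call sampler.set_epoch(epoch) so ordering differs per epoch).
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

def train_dataloader(self):
    dataset = MNIST(self.data_dir, train=True, normalize=(0.1307, 0.3081))
    sampler = DistributedSampler(dataset, shuffle=True)
    return DataLoader(dataset, batch_size=32, sampler=sampler)
```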

Comment on lines +83 to +84
dm.prepare_data()
dm.setup(stage=None)
Contributor


I thought the Trainer / LightningModule would call these automatically? @nateraw
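
For what it's worth, a short sketch of the distinction, assuming the 0.9-era LightningDataModule API: the explicit calls are needed when the datamodule is used standalone, while a Trainer run invokes the hooks itself.

```python
# Sketch assuming the 0.9-era datamodule API; MNISTDataModule is hypothetical.
import pytorch_lightning as pl

dm = MNISTDataModule(data_dir="./data")

# Standalone use (as in the test above): the hooks must be called manually.
dm.prepare_data()
dm.setup(stage=None)

# Inside a Trainer run, fit() calls prepare_data()/setup() on the datamodule.
trainer = pl.Trainer(gpus=2, distributed_backend="ddp_spawn", sync_batchnorm=True)
trainer.fit(model, datamodule=dm)  # model: a LightningModule instance
```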

@Borda Borda modified the milestones: 1.0.0, 0.9.0 Aug 20, 2020