Automatically find and run special tests #6669
Conversation
It seems like parenthesis notation is not supported in CI.
force-pushed from 38c961e to d25bceb
Codecov Report
@@            Coverage Diff            @@
##           master   #6669      +/-  ##
========================================
- Coverage      91%     83%       -8%
========================================
  Files         192     193        +1
  Lines       12248   12981      +733
========================================
- Hits        11162   10771      -391
- Misses       1086    2210     +1124
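As a sanity check on the report above, the deltas are internally consistent: the increase in misses equals the new lines minus the change in hits, and the post-change coverage follows from hits over lines. A quick worked check:

```python
# Verify the Codecov diff arithmetic from the report above.
lines_before, lines_after = 12248, 12981
hits_before, hits_after = 11162, 10771
misses_before, misses_after = 1086, 2210

assert lines_after - lines_before == 733      # +733 lines
assert hits_after - hits_before == -391       # -391 hits
assert misses_after - misses_before == 1124   # +1124 misses
# misses delta = lines delta - hits delta
assert 1124 == 733 - (-391)

coverage_after = round(100 * hits_after / lines_after)
print(coverage_after)  # 83
```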
One test seems to be failing even though CI shows it as passing:

Traceback (most recent call last):
  File "/__w/1/s/pytorch_lightning/trainer/trainer.py", line 599, in run_train
    self.train_loop.run_training_epoch()
  File "/__w/1/s/pytorch_lightning/trainer/training_loop.py", line 547, in run_training_epoch
    epoch_output, self.checkpoint_accumulator, self.early_stopping_accumulator, self.num_optimizers
  File "/__w/1/s/pytorch_lightning/trainer/connectors/logger_connector/logger_connector.py", line 442, in log_train_epoch_end_metrics
    self.training_epoch_end(model, epoch_output, num_optimizers)
  File "/__w/1/s/pytorch_lightning/trainer/connectors/logger_connector/logger_connector.py", line 494, in training_epoch_end
    epoch_output = model.training_epoch_end(epoch_output)
  File "/__w/1/s/tests/utilities/test_all_gather_grad.py", line 70, in training_epoch_end
    assert gathered_loss["losses_tensor_int"][0].dtype == torch.int64
AssertionError: assert torch.int32 == torch.int64
  +torch.int32
  -torch.int64

cc: @tchaton as the failing test author

edit: resolved
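For readers following along: the assertion fails because the gathered tensor comes back as 32-bit integers where the test expects 64-bit. A stdlib-only illustration of this failure mode (a hypothetical analogue, not the actual torch all_gather code path) uses item size in bytes in place of dtype:

```python
# Hypothetical illustration of the dtype mismatch (stdlib only, not torch):
# a 64-bit integer buffer rebuilt as 32-bit no longer matches on "dtype"
# (here modeled as itemsize in bytes).
import array

original = array.array("q", [1, 2, 3])  # "q" = signed 64-bit, like torch.int64
gathered = array.array("i", original)   # "i" = signed 32-bit, like torch.int32

print(original.itemsize, gathered.itemsize)  # 8 4
assert gathered.itemsize != original.itemsize  # mirrors the failing assert
```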
Hello @carmocca! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-03-26 16:41:50 UTC
Nice to see this automated. This will be useful for new contributors who don't know how these special tests work.
I've seen this pattern in other repos, hopefully we can get this fix into other people's CI!
Fantastic work @carmocca :)
What does this PR do?

Tired of having an outdated special_tests.sh file :)

- benchmarks special tests are also included now
- DDPLauncher

Produces the following report:
Fixes https://github.com/PyTorchLightning/internal-dev/issues/125
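The discovery mechanism described above can be sketched roughly as follows: scan the test tree for files containing a marker and collect them for execution. This is a hedged sketch only; the marker string, directory layout, and runner invocation are assumptions, not the PR's exact implementation.

```python
# Hypothetical sketch of auto-discovering "special" tests: build a tiny test
# tree, then collect every test file containing an assumed marker.
import tempfile
from pathlib import Path

MARKER = "@RunIf(special=True)"  # assumed marker; the real one may differ

root = Path(tempfile.mkdtemp())
(root / "tests").mkdir()
(root / "tests" / "test_special.py").write_text(
    f"# {MARKER}\ndef test_special_thing():\n    pass\n"
)
(root / "tests" / "test_normal.py").write_text(
    "def test_normal_thing():\n    pass\n"
)

# The real script would invoke pytest on each match in a separate process;
# here we only report which files would run.
special = sorted(p.name for p in root.rglob("test_*.py") if MARKER in p.read_text())
print(special)  # ['test_special.py']
```

This keeps the list of special tests from going stale: marking a test is enough for it to be picked up, with no manual edit to a runner script.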
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet list: