Fix default logging levels for train step specific hooks #10756
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master   #10756   +/-  ##
=======================================
- Coverage      92%      92%     -0%
=======================================
  Files         177      177
  Lines       16435    16397     -38
=======================================
- Hits        15088    15052     -36
+ Misses       1347     1345      -2
pytorch_lightning/trainer/connectors/logger_connector/fx_validator.py
LGTM!
Co-authored-by: thomas chaton <thomas@grid.ai>
What does this PR do?
The following hooks previously logged with `on_step=False` and `on_epoch=True` by default; they now log with `on_step=True` and `on_epoch=False` by default. Also adds a test that checks the default logging levels for all hooks.

Does your PR introduce any breaking changes? If yes, please list them.
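To illustrate the change, here is a minimal sketch of how per-hook default flags could be resolved. This is not the actual Lightning implementation; the hook names listed are assumptions chosen for illustration, and only the flag values match the behavior described above:

```python
def default_log_flags(hook_name: str) -> dict:
    """Illustrative sketch only, not the real fx_validator logic.

    Train-step-specific hooks (hypothetical set below) now default to
    per-step logging; everything else keeps per-epoch defaults.
    """
    # Assumed example set of train-step-specific hooks, for illustration.
    train_step_hooks = {
        "training_step",
        "on_train_batch_start",
        "on_train_batch_end",
    }
    if hook_name in train_step_hooks:
        # New defaults after this PR: log every step, no epoch aggregation.
        return {"on_step": True, "on_epoch": False}
    # Previous behavior retained for other hooks: aggregate per epoch.
    return {"on_step": False, "on_epoch": True}


# A test in the spirit of the one this PR adds could assert the defaults:
assert default_log_flags("training_step") == {"on_step": True, "on_epoch": False}
assert default_log_flags("validation_step") == {"on_step": False, "on_epoch": True}
```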
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Did you have fun?
Make sure you had fun coding 🙃
cc @Borda @carmocca @edward-io @ananthsub