Improve callback system #947

Closed · 5 tasks · Fixed by #950
hadim opened this issue on Feb 26, 2020 · 6 comments
Labels: feature (Is an improvement or enhancement), help wanted (Open to be worked on)
Milestone: 0.7.0
hadim (Contributor) commented Feb 26, 2020:

Following the new callback system, here are a few things we might want to do, proposed by the people who participated in #889 and #896.

ping @Borda @williamFalcon @jeremyjordan @awaelchli

hadim added the feature (Is an improvement or enhancement) and help wanted (Open to be worked on) labels on Feb 26, 2020
jeremyjordan (Contributor) commented:

Just waiting for #889 to be merged, and I'll pick up the work here.

williamFalcon (Contributor) commented:

merged!

jeremyjordan (Contributor) commented:

@hadim do you have a direct use case for on_fit_start versus on_train_start? We actually don't have access to the model at on_fit_start, and I'm wondering if we can just simplify the callbacks by removing the on_fit_start and on_fit_end events. I don't think anything important happens between on_fit_start and on_train_start that would require both.
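
A minimal sketch of the kind of callback this affects, assuming the (trainer, pl_module) hook signature from the new callback system; the class name is illustrative and the Callback import path may differ by release:

```python
from pytorch_lightning.callbacks import Callback


class ParamCountCallback(Callback):
    """Hypothetical callback that needs the model, so it hooks on_train_start."""

    def on_train_start(self, trainer, pl_module):
        # pl_module (the LightningModule) is available here, unlike in
        # on_fit_start, which is the motivation for dropping the fit-level hooks.
        n_params = sum(p.numel() for p in pl_module.parameters())
        print(f"starting training with {n_params} parameters")
```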

jeremyjordan (Contributor) commented:

Also, I appreciate the desire to keep things symmetric, but I'd like to remove the pl_module arg from on_init_start and on_init_end, since we'd be asking the user to include an argument which they wouldn't be able to access (we pass None as the value internally).
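
For illustration, a sketch of what the init hooks would look like after that change, with pl_module dropped from their signatures; the class name and import path here are assumptions, not part of the proposal:

```python
from pytorch_lightning.callbacks import Callback


class InitLoggingCallback(Callback):
    """Hypothetical callback using the trainer-only init hooks."""

    def on_init_start(self, trainer):
        # Called from Trainer.__init__; no LightningModule exists yet,
        # so the hook only receives the trainer.
        print("trainer init started")

    def on_init_end(self, trainer):
        # Still trainer-only: the model is attached later, at fit time.
        print("trainer init finished")
```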

hadim (Contributor, Author) commented Feb 27, 2020:

I agree.

I thought on_train_start/on_train_end were the equivalent of on_validation_start/on_validation_end, but they actually encompass it, which makes on_fit_start and on_fit_end kind of useless in that case.

As for pl_module, sure, it makes sense if you are 100% sure that the model will never be available at that point, even in future PL releases or in a complex PL module. I prefer to have a useless argument that prevents breaking backward compatibility in the future rather than the opposite. I am not against removing it, but I am fine with keeping it too.

jeremyjordan (Contributor) commented:

I am fairly confident given that we're talking about the init method of the Trainer and not of the actual PL module. I will go ahead and make the change, thanks.

Borda added this to the 0.7.0 milestone on Feb 28, 2020