add gradient clipping to create_supervised_trainer()
#419
Comments
@lmarti thanks for the feedback. We discussed a similar question in #375. We can discuss whether such a trainer could be useful and whether it could be placed in the contrib module.
Sorry, I missed that one. I had the same doubts w.r.t. moving it there.
A general way to maintain this would be to fire a new event. It doesn't have to be added to the core events; it could just be added for the trainer created by `create_supervised_trainer()`.
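For concreteness, a minimal sketch of this custom-event idea, assuming a recent Ignite version where `EventEnum`, `Engine.register_events`, and `Engine.fire_event` are available. The names `GradientEvents`, `BACKWARD_COMPLETED`, and `create_trainer_with_backward_event` are hypothetical, not Ignite API:

```python
# Sketch only: fire a hypothetical event between backward() and step(),
# so handlers (e.g. gradient clipping) can run at exactly that point.
from ignite.engine import Engine, EventEnum


class GradientEvents(EventEnum):
    BACKWARD_COMPLETED = "backward_completed"


def create_trainer_with_backward_event(model, optimizer, loss_fn):
    def _update(engine, batch):
        model.train()
        optimizer.zero_grad()
        x, y = batch
        loss = loss_fn(model(x), y)
        loss.backward()
        # Handlers attached to BACKWARD_COMPLETED run here, before the step.
        engine.fire_event(GradientEvents.BACKWARD_COMPLETED)
        optimizer.step()
        return loss.item()

    trainer = Engine(_update)
    trainer.register_events(*GradientEvents)
    return trainer


# Usage (assuming model, optimizer, loss_fn are defined elsewhere):
# trainer = create_trainer_with_backward_event(model, optimizer, loss_fn)
# trainer.add_event_handler(
#     GradientEvents.BACKWARD_COMPLETED,
#     lambda engine: torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0),
# )
```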
@AntoinePrv I think it would be simpler to write a custom processing function instead of custom events.
@vfdev-5 While I agree with you, it would be nice to have options. In particular, it would be great if we could have a set of events as rich as the fastai callback system's.
@sudarshan85 we can think about providing a generic callback class in the contrib module.
@lmarti I am interested in this issue and would like to contribute. Please assign it to me.
@TilakSanghvi you can start from this PR: #1693 and add tests.
It would be good to add gradient clipping to the trainers created by `create_supervised_trainer()`. This is already provided by `torch.nn.utils.clip_grad_norm_`. One possible implementation could be:
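A minimal sketch of one possible implementation, mirroring the standard supervised update step; the `grad_clip` keyword argument and the factory name are illustrative, not part of Ignite's API:

```python
# Sketch only: a create_supervised_trainer-style factory with optional
# gradient clipping between backward() and the optimizer step.
import torch
from ignite.engine import Engine


def create_supervised_trainer_with_clipping(model, optimizer, loss_fn, grad_clip=None):
    def _update(engine, batch):
        model.train()
        optimizer.zero_grad()
        x, y = batch
        y_pred = model(x)
        loss = loss_fn(y_pred, y)
        loss.backward()
        if grad_clip is not None:
            # Rescale gradients so their global norm does not exceed grad_clip.
            torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=grad_clip)
        optimizer.step()
        return loss.item()

    return Engine(_update)
```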