clip_gradient with clip_grad_value #5456
Comments
Hey @dhkim0225, sounds like a great feature. Would you mind making a PR to enable it? Best regards, tchaton
@tchaton I'm really glad to do that! Renaming the … Do you have any good ideas? Sincerely, dhkim0225
Hey @dhkim0225, I just checked. You can pass those arguments directly to the Trainer. It will use …
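A minimal sketch of what "passing those arguments directly to the Trainer" looks like, assuming a standard PyTorch Lightning setup; `gradient_clip_val` is the Trainer argument being referenced, while the `0.5` threshold and the commented-out `fit` call are illustrative placeholders:

```python
# A minimal sketch, assuming a standard PyTorch Lightning setup.
# The 0.5 threshold is an illustrative placeholder value.
import pytorch_lightning as pl

trainer = pl.Trainer(
    max_epochs=10,
    gradient_clip_val=0.5,  # clip gradients (by norm, at the time of this issue)
)
# trainer.fit(model, train_dataloader)
```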
Closing this issue, as the parameter is already exposed in the Trainer. Feel free to re-open it if you think I missed something. Best, tchaton
🚀 Feature
Same issue as #4927.
The current `clip_gradient` uses `clip_grad_norm`; can we add `clip_grad_value`?
https://github.com/PyTorchLightning/pytorch-lightning/blob/f2e99d617f05ec65fded81ccc6d0d59807c47573/pytorch_lightning/plugins/native_amp.py#L63-L65
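For reference, a minimal sketch of the difference between the two clipping strategies in plain PyTorch; `torch.nn.utils.clip_grad_norm_` and `torch.nn.utils.clip_grad_value_` are the underlying PyTorch utilities, while the model and thresholds are illustrative placeholders:

```python
# A minimal sketch contrasting the two clipping strategies in plain PyTorch.
# The model, loss, and thresholds below are illustrative placeholders.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
loss = model(torch.randn(4, 10)).sum()
loss.backward()

# Norm-based clipping (what clip_gradient uses today): rescales all
# gradients together so their total norm does not exceed max_norm.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

# Value-based clipping (the requested addition): clamps each gradient
# element into [-clip_value, clip_value] independently.
torch.nn.utils.clip_grad_value_(model.parameters(), clip_value=0.5)
```

If I recall correctly, later PyTorch Lightning releases exposed this choice through a `gradient_clip_algorithm` Trainer argument, letting users select norm- or value-based clipping alongside `gradient_clip_val`.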