Flux.Optimise.update! updating grads instead of params? #2121
Comments
I am 99% sure this is not a bug, and I am just doing something weird. But perhaps the fact that I am getting this behavior and cannot figure out what I am doing wrong points to an issue in the documentation.
Pasting that in, I get initial & final losses that do differ, so the parameters are being updated. Flux.Optimise does mutate the gradients, though: #2098 removed one effect of this (on v0.13.8), but not the one seen here.
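As a minimal sketch of that mutation under the v0.13-style implicit-params API (the variable names here are illustrative, not from the issue):

```julia
using Flux

W = [1.0f0]                            # one trainable parameter
ps = Flux.params(W)
opt = Flux.Optimise.Descent(0.1)

gs = gradient(() -> sum(abs2, W), ps)  # dW = 2W = [2.0f0]
g_before = copy(gs[W])

Flux.Optimise.update!(opt, ps, gs)     # W is updated in place to [0.8f0]...

gs[W] == g_before                      # ...and this is false: Descent also
                                       # rescaled the gradient buffer in place
```

So seeing the grads change after update! is expected; it does not by itself mean the parameters were left untouched.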
Hmm, you are right. I cannot reproduce the behavior I was observing anymore.

The problem was not recomputing the gradients before each update.
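For reference, a sketch of a loop that does recompute the gradients on every step; the model, data, and optimiser below are stand-ins, not the original code:

```julia
using Flux

predict = Dense(1 => 1)
x = rand(Float32, 1, 32)
y = 3 .* x .+ 1

loss() = Flux.Losses.mse(predict(x), y)
ps = Flux.params(predict)
opt = Flux.Optimise.Descent(0.1)

for step in 1:100
    gs = gradient(loss, ps)            # fresh gradients each iteration
    Flux.Optimise.update!(opt, ps, gs)
end
```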
Package Version
v0.13.7
Julia Version
1.8.2
OS / Environment
Windows 11
Describe the bug
`Flux.Optimise.update!` seems to update grads instead of params. I must be doing something wrong, but this is the result I am getting.

Steps to Reproduce
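A hypothetical script consistent with the description, not the reporter's original code; it assumes a one-layer Dense model, MSE loss, and a Descent optimiser, and reuses stale gradients across updates, which shows the reported symptom (grads change, loss barely moves):

```julia
using Flux

predict = Dense(1 => 1)
x = rand(Float32, 1, 10)
y = 2 .* x .+ 1

loss() = Flux.Losses.mse(predict(x), y)
ps = Flux.params(predict)
opt = Flux.Optimise.Descent(0.1)

@show loss()                     # initial loss

gs = gradient(loss, ps)          # computed once, never refreshed
for step in 1:100
    Flux.Optimise.update!(opt, ps, gs)  # shrinks gs in place each call,
end                                     # so the parameters quickly stall

@show loss()                     # final loss: barely moved
```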
Expected Results
I was expecting the `params(predict)` to change, and the loss to go down.

Observed Results
Instead, the grads changed, and the loss and parameters of the NN did not change.
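One way to check which objects a single update! call actually mutates, a sketch using the same kind of illustrative setup as above:

```julia
using Flux

predict = Dense(1 => 1)
x = rand(Float32, 1, 10)
y = 2 .* x .+ 1
loss() = Flux.Losses.mse(predict(x), y)
ps = Flux.params(predict)
opt = Flux.Optimise.Descent(0.1)

params_before = deepcopy(collect(ps))
gs = gradient(loss, ps)
grads_before = deepcopy([gs[p] for p in ps])

Flux.Optimise.update!(opt, ps, gs)

@show params_before == collect(ps)         # false: the parameters did move
@show grads_before == [gs[p] for p in ps]  # false: the grads were mutated too
```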
Relevant log output
No response