Depend on Optimisers.jl #1864
Conversation
```
@@ -1,12 +1,14 @@
module Flux

# Zero Flux Given
```
I'd totally forgotten about this line 😆
I think it would be nice to replace all those `using` statements with `import`, but that could be done in another PR.
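For context, a minimal sketch of the difference being proposed (using stdlib modules rather than Flux's actual dependency list):

```julia
using Statistics             # `using` brings exported names into scope
mean([1, 2, 3])              # so calls work unqualified

import LinearAlgebra         # `import` brings in only the module name
LinearAlgebra.norm([3, 4])   # so every use must be qualified
```

The appeal of `import` is that each call site names the package it comes from.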
src/Flux.jl (Outdated)
```
@reexport using NNlib

using Zygote
```
Can this go away (given the line just below)? Or at least be replaced by an `import`.
Actually, to remove `using Zygote` one also needs to replace `Flux.withgradient` with `Zygote.withgradient` in the recurrent.jl tests for the tests to pass.
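A hypothetical before/after of that test change (the loss and input here are made up; only the `Flux.withgradient` → `Zygote.withgradient` rename is from the comment):

```julia
using Zygote

f(x) = sum(abs2, x)
x = rand(Float32, 3)

# before: val, grad = Flux.withgradient(f, x)   # resolves only via Flux's `using Zygote`
val, grad = Zygote.withgradient(f, x)           # qualified, survives removing the `using`
```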
I think this could be better served by a separate PR that does some general import cleanup.
Yes I agree.
Optimisers.jl 0.2 is now tagged.
This is, I think, the minimal step to make Flux usable with Optimisers.jl: both should see the same function `trainable`. Will fail until Optimisers.jl 0.2 is tagged; 0.1 does not have this function.
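As a sketch of what sharing `trainable` enables (the `Affine` layer is invented for illustration; the identity check is just the PR's stated goal restated as code):

```julia
using Flux, Optimisers

Flux.trainable === Optimisers.trainable   # should hold once both extend one function

struct Affine; W; b; σ; end
Flux.@functor Affine                              # make the struct walkable
Flux.trainable(a::Affine) = (W = a.W, b = a.b)    # hide σ from optimisation
```

With a single shared function, a method like the one above is seen by Optimisers.jl's traversal as well as by Flux's own `params`.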
If we aren't deleting implicit parameters from v0.13, can we do any more than this? Maybe we can allow `setup(opt, ::Params)`, and write an `update!` which uses that. But this PR does not do it.

Xref #1481, which completely replaces Flux's code, and I think
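Purely as illustration of the `setup(opt, ::Params)` idea floated above — nothing like this exists in this PR or in Optimisers.jl, and the helper names are invented:

```julia
using Optimisers, Zygote

# Hypothetical: one Optimisers state tree per parameter array in a Params.
setup_params(opt, ps::Zygote.Params) =
    IdDict(p => Optimisers.setup(opt, p) for p in ps)

# Hypothetical: apply Zygote's implicit gradients through Optimisers.update!.
function update_params!(states, ps::Zygote.Params, gs::Zygote.Grads)
    for p in ps
        gs[p] === nothing && continue                  # parameter unused in loss
        states[p], _ = Optimisers.update!(states[p], p, gs[p])
    end
    return states
end
```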