
Deprecate FastChain and sciml_train for v2.0 #794

Merged: 36 commits merged into master on Jan 20, 2023

Conversation
ChrisRackauckas (Member):

Supersedes #715

Review comment on test/neural_dae.jl (outdated, resolved).
Comment on lines +27 to +31:

```julia
nn = Lux.Chain(x -> x.^3,
               Lux.Dense(2, 16, tanh),
               Lux.Dense(16, 2))
p_init, st = Lux.setup(rng, nn)
p_init = Lux.ComponentArray(p_init)
```
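For context, a minimal sketch of how a Lux setup like this feeds into a `NeuralODE` under the new API; the `rng`, initial condition, `tspan`, and solver here are illustrative assumptions, not taken from the diff:

```julia
using DiffEqFlux, OrdinaryDiffEq, ComponentArrays
import Lux, Random

rng = Random.default_rng()            # assumed; the diff's `rng` is defined elsewhere
nn = Lux.Chain(x -> x .^ 3,
               Lux.Dense(2, 16, tanh),
               Lux.Dense(16, 2))
p_init, st = Lux.setup(rng, nn)
p_init = ComponentArray(p_init)       # flat, AD-friendly parameter vector

node = NeuralODE(nn, (0.0f0, 1.5f0), Tsit5(), saveat = 0.1f0)
sol, st = node(Float32[2.0, 0.0], p_init, st)  # Lux models return (solution, state)
```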
ChrisRackauckas (Member Author):
@Abhishek-1Bhatt did you test Lux with multiple shooting?

```diff
 @test loss_function(res.minimizer) < l1
-res = DiffEqFlux.sciml_train(loss_function, nODE.p, Optim.KrylovTrustRegion(), Optimization.AutoZygote(), maxiters = 100, callback=cb)
+res = Optimization.solve(optprob, KrylovTrustRegion(), maxiters=100, callback=cb)
 @test loss_function(res.minimizer) < l1
```
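The `optprob` in the replacement line is not defined in this hunk; a hedged sketch of how it would be built with the Optimization.jl API, assuming the test's existing `loss_function`, `nODE.p`, and `cb`:

```julia
using Optimization, OptimizationOptimJL  # Optim.KrylovTrustRegion comes via Optim.jl

# Wrap the existing loss for Optimization.jl's (u, p) calling convention:
optf = Optimization.OptimizationFunction((p, _) -> loss_function(p),
                                         Optimization.AutoZygote())
optprob = Optimization.OptimizationProblem(optf, nODE.p)
res = Optimization.solve(optprob, Optim.KrylovTrustRegion(),
                         maxiters = 100, callback = cb)
```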
ChrisRackauckas (Member Author):
We should probably also test second order Lux.

Review comment on Project.toml (outdated, resolved).
prbzrg (Member) commented Jan 17, 2023:

I think Flux optimizers get converted to Optimisers ones. So it's better to replace OptimizationFlux with OptimizationOptimisers in tests. (I'm not sure)

ChrisRackauckas (Member Author):

> I think Flux optimizers get converted to Optimisers ones. So it's better to replace OptimizationFlux with OptimizationOptimisers in tests. (I'm not sure)

Yes that is true, but I don't want to mix too many changes at the same time. Since that's separate from DiffEqFlux itself, I'd like to hold off on that. But definitely we want to get rid of Flux.jl and Optim.jl optimizer usage.
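For reference, a hedged sketch of what that eventual swap would look like, reusing an `optprob` like the one above (the rule choice and hyperparameters are assumptions):

```julia
using Optimization, OptimizationOptimisers

# Optimisers.jl rule in place of the old Flux.ADAM optimizer:
res = Optimization.solve(optprob, OptimizationOptimisers.Adam(0.01),
                         maxiters = 300, callback = cb)
```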

prbzrg (Member) commented Jan 17, 2023:

IMO, for a breaking release, it's better to resolve #744 first, and remove ZygoteRules too.

ChrisRackauckas (Member Author):

Yes that should be in the release too. I think it can just be removed and the tests will still pass? I guess just give it a try? If Newton and FFJORD pass then we're good.

```julia
import NNlib
import Lux
using Requires
using Cassette
@reexport using Flux
```
prbzrg (Member), Jan 17, 2023:
Now that we support Lux, reexporting Flux can cause qualification errors in code that only uses Lux, with errors like:

```
WARNING: both Lux and Flux export "Chain"; uses of it in module Main must be qualified
ERROR: UndefVarError: Chain not defined
```
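A minimal reproduction of the clash, in a hypothetical session with both packages loaded:

```julia
using Lux, Flux   # both export `Chain` and `Dense`

model = Chain(Dense(2, 16, tanh))           # errors: `Chain` must be qualified
model = Lux.Chain(Lux.Dense(2, 16, tanh))   # qualifying resolves the ambiguity
```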
ChrisRackauckas (Member Author):
Yeah we should remove the reexport. Add that to the list of things.
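A hedged sketch of that import block with the reexport dropped (the final form landed in a later commit and is not shown in this thread):

```julia
import NNlib
import Lux
using Requires
using Cassette
using Flux   # internal use only; no longer reexported to downstream code
```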

Review comment on test/mnist_gpu.jl (outdated, resolved).
Co-authored-by: Abhishek Bhatt <46929125+Abhishek-1Bhatt@users.noreply.github.com>
ChrisRackauckas (Member Author):

I'm trying to just get this to a mergeable state now. The plan will be to hold master as breaking for a month (which we normally don't do, but we can make an exception for a deprecation this big). We should do all of the remaining items: remove the Zygote rule overloads (making sure they are upstreamed), set up LuxCore, update the docs to not use OptimizationFlux, etc., and get this 2.0 fully ready. But this keeps being a runaway train of a PR, so I want to get something that works merged, and then continue with improvements that can be done in parallel.

@ChrisRackauckas merged commit f9beff3 into master on Jan 20, 2023
@ChrisRackauckas deleted the depremoval2 branch on January 20, 2023, 20:18
ChrisRackauckas (Member Author):

🎉 master now in v2.0 mode, slam it with the breaking changes you want.
