This repository has been archived by the owner on Jul 19, 2023. It is now read-only.

Rename the package? #360

Open
ChrisRackauckas opened this issue Mar 28, 2021 · 12 comments

Comments

@ChrisRackauckas
Member

This package is no longer (and hasn't for a long time been) about defining a DiffEqOperators interface. It's a package for finite difference discretizations of PDEs. What's a good name? It could be something plain like PDEFiniteDiff.jl, or something fancy like Fornberg.jl.

@valentinsulzer
Contributor

I like PDEFiniteDiff.jl better, but is the jacvec stuff limited to finite differences, or is it more general?

@ChrisRackauckas
Member Author

The jacvec stuff should move to DiffEqBase. This should be a solely finite difference PDE package once that's done.

@yewalenikhil65

The jacvec stuff should move to DiffEqBase. This should be a solely finite difference PDE package once that's done.

I wonder if it would be cooler to include more operators later, maybe global representations of functions, like spectral differential operators (Fourier-based or Chebyshev differentiation matrices). That way it wouldn't remain limited to finite differences?

@ChrisRackauckas
Member Author

That would just be a different package. This package is so fully focused on finite difference stuff that those pieces wouldn't make sense here.

@danielwe

Curious about the status and plans here. I'm getting the impression that this package is somewhat superseded, perhaps bordering on deprecated (and not just incomplete, as the warning in the documentation states; by the way, it would be great to have a similar warning in the README). However, this seems to be the package that comes closest to what I need at the moment, which is not at all related to PDEs, but needs lazy composition/combination of both array-based and function-based operators for the Jacobian. In particular, I have terms that are very low rank (but not sparse) and should be specified as, say, an N x 2 times 2 x N composition. LinearMaps works in a pinch, but it doesn't have built-in support for state dependence/coefficient updating, and there isn't even a workaround if you want to update the scalar in a ScaledMap.

What's still missing for my use case is a non-square jacvec operator (for the 2 x N factor), ref. JuliaDiff/SparseDiffTools.jl#184
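For concreteness, here is a minimal sketch (not from the thread) of the lazy low-rank composition described above, using LinearMaps.jl; the sizes and the function-based factor are made up, and the sketch deliberately ignores the missing coefficient-update mechanism.

```julia
using LinearMaps

N = 100
B = LinearMap(randn(N, 2))                          # array-based N × 2 factor
C = LinearMap(x -> [sum(x), x[1] - x[end]], 2, N)   # function-based 2 × N factor

L = B * C       # lazy composition; the N × N matrix is never materialized
v = randn(N)
y = L * v       # applies C, then B: O(N) work per product instead of O(N²)
```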

@xtalax
Member

xtalax commented May 17, 2022

This being superseded is a somewhat new phenomenon, and it has kind of crept up on us. I still think that the tools here have some merit, but the package's scope may need to be reduced to be more maintainable. It was mentioned above that the jacvec stuff would be moved to DiffEqBase; is that still planned, @ChrisRackauckas?

@danielwe

danielwe commented May 17, 2022

Weakly held opinions:

  • Push to have the community (and not just SciML) consolidate around one canonical package for lazy linear map composition. To me, LinearMaps looks like the best candidate, but it needs to support the update mechanism that DiffEqOperators offers, i.e., registering update functions on the leaf operators and calling them recursively from the root (see the sketch after this list). It would also be useful if function maps allowed extra arguments stored on the object (closures break updatability; requiring that the user define a callable type is fine but less friendly).
  • Upstream jacvec and hesvec functions to ForwardDiff, based on the ones in SparseDiffTools.
  • DiffEqBase could then define a JacVec operator that's just a LinearMaps.FunctionMap wrapping the jacvec function from ForwardDiff (and a VecJac that wraps a Zygote pullback). But this isn't crucial if the pieces already exist and work well in their respective packages. Simply having a tutorial example that shows how it can be done may be sufficient.
  • I don't know anything about the PDE discretization stuff but sounds like MethodOfLines is the final destination for that.
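For reference, here is a rough sketch (not from the thread) of the update mechanism the first bullet refers to, assuming the DiffEqOperators/SciMLBase API of the time (DiffEqArrayOperator with an update_func keyword, and update_coefficients! from the SciML operator interface); exact names and signatures should be double-checked against the current packages.

```julia
using DiffEqOperators
import SciMLBase: update_coefficients!

A = rand(4, 4)
# Leaf operator with a registered update function that mutates its matrix from (u, p, t):
L1 = DiffEqArrayOperator(copy(A); update_func = (M, u, p, t) -> (M .= t .* A))
L2 = DiffEqArrayOperator(rand(4, 4))   # state-independent leaf

L = L1 * L2                            # lazy composite operator

u, p, t = rand(4), nothing, 2.0
update_coefficients!(L, u, p, t)       # recurses into the leaves and calls their update functions
y = L * u
```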

@xtalax
Member

xtalax commented May 17, 2022

As for point 1, the operators defined here have locally acting stencils, and as such can be mapped to multi-dimensional GPU convolutions by leveraging NNlib. That is the key feature of merit that this package implements; I don't know whether LinearMaps also does this, however.
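To illustrate the stencil-as-convolution point (this is only an illustration, not DiffEqOperators internals): a second-derivative central-difference stencil can be applied with NNlib's conv, and the same call runs on the GPU when the arrays are GPU arrays.

```julia
using NNlib

N, dx = 64, 0.1f0
u = rand(Float32, N)
x = reshape(u, N, 1, 1)                            # NNlib layout: (length, channels, batch)
w = reshape(Float32[1, -2, 1] ./ dx^2, 3, 1, 1)    # [1, -2, 1]/dx² stencil as a conv kernel

d2u = conv(x, w)                                   # size (N - 2, 1, 1): interior second differences
```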

As for the rest, I'm not particularly versed in VecJac/JacVec so can't really comment.

@danielwe

danielwe commented May 17, 2022

I see! I don't know exactly what that means but it sounds plausible that LinearMaps won't support that any time soon.

I guess I'm at the other end of the spectrum: I landed here while searching for updatable but otherwise straightforward lazy operator compositions and a non-square in-place jacvec operator. There are a couple of combinations of packages that get me 85% of the way there, and I'm happy to contribute to going all the way, but I'm unsure where my efforts are best spent.

@ChrisRackauckas
Member Author

Well, there's a lot to say here. Good points, good opinions.

Push to have the community (and not just SciML) consolidate around one canonical package for lazy linear map composition. To me, LinearMaps looks like the best candidate, but it needs to support the update mechanism that DiffEqOperators offers, i.e., registering update functions on the leaf operators and calling them recursively from the root. It would also be useful if function maps allowed extra arguments stored on the object (closures break updatability; requiring that the user define a callable type is fine but less friendly).

Indeed, without the update functionality LinearMaps just cannot be used by the solvers. So the AbstractSciMLOperator interface is the one to use. We would need any other interface to at least be a superset of that, which nothing is, so that's the one we're going to use and push for.

With that, all of the interface is in SciMLBase, and the basic operators that are defined using only standard library tools should all be in SciMLBase. That's almost all completed, but there are two last operators to migrate:

https://github.com/SciML/DiffEqOperators.jl/blob/master/src/matrixfree_operators.jl
https://github.com/SciML/DiffEqOperators.jl/blob/master/src/composite_operators.jl

Then we should define the operator interface here:

https://scimlbase.sciml.ai/dev/

Or, we should make a SciMLOperators package which, like LinearMaps, just defines AbstractSciMLOperator for other things (such as SciMLBase) to use. Then it would have its own docs. I don't quite have the time for this right now, but if anyone wants to take up the charge, this shouldn't take more than a few hours. I would be happy to set up the domain for the docs, set up the CI, etc., and tag-team it with someone, though.

Upstream jacvec and hesvec functions to ForwardDiff, based on the ones in SparseDiffTools.

Indeed, that would be good to do. Or, they should be in AbstractDifferentiation.jl, and every differentiation package should just extend that method set. That's our ultimate goal, but I wouldn't be surprised if it takes a year to jiggle the interfaces around.

DiffEqBase could then define a JacVec operator that's just a LinearMaps.FunctionMap wrapping the jacvec function from ForwardDiff (and a VecJac that wraps a Zygote pullback). But this isn't crucial if the pieces already exist and work well in their respective packages. Simply having a tutorial example that shows how it can be done may be sufficient.

Note that the JacVec operator is now using SparseDiffTools, and if a Krylov method is used it's done automatically. So the one in DiffEqOperators is pretty much deprecated (like most things here), and there aren't many use cases where a user should need to directly use it themselves (though it would still be nice to expose, indeed at the DiffEqBase level).
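As a rough sketch of what using that machinery directly looks like: SparseDiffTools provides the function form auto_jacvec and a lazy operator form JacVec (the exact JacVec constructor signature is an assumption here and may differ between versions).

```julia
using SparseDiffTools

f(u) = [u[1]^2 + u[2], sin(u[1]) * u[2]]   # toy nonlinear map (square, 2 -> 2)
u, v = rand(2), rand(2)

Jv = auto_jacvec(f, u, v)   # forward-mode Jacobian-vector product J(u) * v, no Jacobian materialized

L = JacVec(f, u)            # lazy operator form; constructor signature assumed, check the docs
Jv2 = L * v
```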

I don't know anything about the PDE discretization stuff but sounds like MethodOfLines is the final destination for that.

It definitely is.

So then what is this repo? Pretty much, a dying corpse. Most operators should not have any dependencies and should be in a repo for people to extend. The other ones are for PDE stuff, but for PDE stuff it's more efficient to not use the operator form and instead fuse loops. So this repo is dying a slow death, and we should try to accelerate that death if we can.

@danielwe

Thanks for the thorough rundown, super helpful! Gives me a better idea of how to get my stuff up and running in the present state. Hope to come back and help out with the transition once I've got things working.

I understand that you want most users to not have to interact directly with the operator facilities, but they are a fantastic tool to have available sometimes, so it would be great to have them exposed and explained in user-facing documentation eventually.

there aren't many use cases where a user should need to directly use it themselves

My example is a term like B * h(u, p, t) where B is a state-independent matrix (typically tall and narrow; the low rank is why this factorization of the term makes sense; think of it as a feedback control term for a high-dimensional system with only a few control channels). Using ForwardDiff to turn the entire term into a JVP operator means propagating dual numbers through the matmul, which means using a generic implementation and doing redundant flops. Specifying the Jacobian prototype by hand means you can use a non-square JVP operator for the h(u, p, t) factor only and compose it with B. Now you have a float matmul that hits BLAS. Of course, this is not strictly needed, it's just an optimization, but it's a pretty quick and straightforward one if a JVP operator and operator composition are easily accessible (and it's the reason I've been talking about the need for non-square JVP functions and operators).

(Not sure there's any need for a separate JVP operator type, though? If a JVP function and a matrix-free operator type both exist, it seems like all you'd need is a helper function to wrap the latter around the former.)
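For concreteness, a minimal sketch of the optimization described in the B * h(u, p, t) example above, with made-up sizes and a toy h: the JVP of u -> B * h(u, p, t) is computed as B * (J_h(u) * v), so dual numbers only flow through h and the multiply by B stays a plain BLAS matmul.

```julia
using ForwardDiff

N = 1000
B = randn(N, 2)                                  # tall, state-independent factor
h(u, p, t) = [sum(abs2, u), u[1] * u[end]]       # toy 2-output nonlinearity

# Non-square JVP of h alone: directional derivative along v via a single forward-mode seed.
jvp_h(u, v, p, t) = ForwardDiff.derivative(ε -> h(u .+ ε .* v, p, t), 0.0)

u, v, p, t = randn(N), randn(N), nothing, 0.0
Jv = B * jvp_h(u, v, p, t)                       # JVP of the full term B * h(u, p, t)
```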

@ChrisRackauckas
Member Author

https://github.com/SciML/SciMLOperators.jl is chugging along.
