Use "differentiable optimization trick" for backpropagation through tangent vector field calculation #74
Comments
I am seeing some issues with super-long compile times in the optimization context, which are eliminated when we use a […]
Yep, this sounds like a good plan to me. How hard do we anticipate the manual adjoint will be?
I am looking at it a bit. It might actually be relatively straightforward. Here's a reference that seems nice; it even includes JAX code: https://implicit-layers-tutorial.org/implicit_functions/
@smartalecH @Luochenghuang I have things working here; all it needed was a bit of regularization.
I think we may want to put this on hold for now: the potential accuracy improvement is small, and there is a speed penalty.
Currently, we backpropagate directly through the tangent vector field calculation, which involves a Newton solve to find the minimum of a convex quadratic objective. It may be more efficient to define a custom gradient for this operation, in a manner similar to what is done for differentiable optimization: differentiate the solution implicitly via its stationarity condition rather than unrolling the solver iterations.
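
For reference, here is a minimal sketch (not the project's actual code) of what such a custom gradient might look like with `jax.custom_vjp`, following the implicit-function-theorem recipe from the tutorial linked above. A generic quadratic stationarity condition stands in for the real tangent-field objective, and the names `stationarity` and `solve_tangent_field` are hypothetical:

```python
import jax
import jax.numpy as jnp

# Stand-in stationarity condition: the minimizer of the convex quadratic
# f(x, p) = 0.5 * x^T A x - b^T x satisfies grad_x f = A x - b = 0.
def stationarity(x, params):
    A, b = params
    return A @ x - b

@jax.custom_vjp
def solve_tangent_field(params):
    # Forward pass: for a quadratic objective the Newton solve is exact
    # after one step, i.e. it reduces to a single linear solve.
    A, b = params
    return jnp.linalg.solve(A, b)

def _fwd(params):
    x_star = solve_tangent_field(params)
    return x_star, (x_star, params)

def _bwd(residuals, x_bar):
    x_star, params = residuals
    A, _ = params
    # Implicit function theorem: instead of unrolling the Newton iterations,
    # solve one adjoint linear system (d stationarity / dx)^T lam = -x_bar.
    lam = jnp.linalg.solve(A.T, -x_bar)
    # Propagate lam through the parameter dependence of the residual.
    _, vjp_params = jax.vjp(lambda p: stationarity(x_star, p), params)
    return vjp_params(lam)

solve_tangent_field.defvjp(_fwd, _bwd)

# Example: gradients of a loss through the solve without differentiating
# the solver itself.
A = jnp.eye(3) + 0.1 * jnp.ones((3, 3))   # symmetric positive definite
b = jnp.arange(3.0)
grads = jax.grad(lambda p: jnp.sum(solve_tangent_field(p) ** 2))((A, b))
```

The appeal of this approach is that the backward pass costs one additional linear solve of the same size as the forward Newton solve, and the solver iterations never enter the traced computation, which should also help with the compile times mentioned above.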