neural-ode

Implementation of the Neural ODEs paper ("Neural Ordinary Differential Equations", Chen et al., 2018). For an accompanying introduction to Neural ODEs, see the file what-are-neural-odes.pdf in this repository. The original paper by Chen et al. is available on arXiv. For the authors' own implementation, see the torchdiffeq package.

Code examples

For a pared-down example of how to train an ODE-net on MNIST, see mnist_trials.ipynb. Note that in this example I do not implement the adjoint method, but instead backpropagate directly through the ODE solver (this corresponds to the "RK-Net" baseline in Table 1 of the Neural ODEs paper). To compare direct backpropagation with the adjoint method, I also include the adjoint method from the torchdiffeq package.
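
As a rough sketch of what backpropagating directly through the solver looks like (the layer sizes, step count, and module names here are illustrative assumptions, not the notebook's exact code), an ODE block can run a fixed-step Runge-Kutta integration in its forward pass, so that calling .backward() on the loss differentiates through every solver step:

```python
# Minimal sketch of an "RK-Net"-style block: gradients flow through the
# Runge-Kutta steps themselves (no adjoint method). Hypothetical sizes/names.
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Parameterizes the dynamics dz/dt = f(t, z)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, z):
        return self.net(z)

class RKBlock(nn.Module):
    """Integrates z from t=0 to t=1 with classic RK4; backprop goes through each step."""
    def __init__(self, func, n_steps=10):
        super().__init__()
        self.func, self.n_steps = func, n_steps

    def forward(self, z):
        h, t = 1.0 / self.n_steps, 0.0
        for _ in range(self.n_steps):
            k1 = self.func(t, z)
            k2 = self.func(t + h / 2, z + h / 2 * k1)
            k3 = self.func(t + h / 2, z + h / 2 * k2)
            k4 = self.func(t + h, z + h * k3)
            z = z + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            t += h
        return z

# Hypothetical MNIST classifier: flatten images, run the ODE block, classify.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 64),
                      RKBlock(ODEFunc(64)), nn.Linear(64, 10))
x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()  # gradients flow through all RK4 steps
```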

For a pared-down implementation of the adjoint method, see adjoint.ipynb. In this notebook, I show how to compute gradients for an ODE solver (in this case, Runge-Kutta) and update parameters without backpropagating through the solver. The gradients from this custom adjoint implementation are compared against those obtained by direct backpropagation and those obtained with the torchdiffeq package.
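
The mechanics can be sketched on a toy problem (a hypothetical linear ODE dz/dt = A z, not the notebook's code): integrate forward with a Runge-Kutta solver, then step the adjoint state a(t) = dL/dz(t) backwards in time, accumulating vector-Jacobian products to obtain dL/dA without differentiating through the forward solver, and compare against direct backpropagation:

```python
import torch

def f(z, A):                          # linear dynamics dz/dt = A z
    return A @ z

def rk4_step(fun, z, h):              # one classic Runge-Kutta (RK4) step
    k1 = fun(z)
    k2 = fun(z + 0.5 * h * k1)
    k3 = fun(z + 0.5 * h * k2)
    k4 = fun(z + h * k3)
    return z + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def solve(z0, A, h, n_steps):         # fixed-step forward integration
    zs = [z0]
    for _ in range(n_steps):
        zs.append(rk4_step(lambda z: f(z, A), zs[-1], h))
    return zs

torch.manual_seed(0)
A = torch.randn(2, 2, requires_grad=True)
z0 = torch.tensor([1.0, 0.0])
n_steps, h = 1000, 1.0 / 1000

# Forward pass: integrate without building an autograd graph.
with torch.no_grad():
    zs = solve(z0, A, h, n_steps)

# Backward pass: start from a(T) = dL/dz(T); here L = sum(z(T)), so a(T) = 1.
a = torch.ones(2)
dLdA = torch.zeros(2, 2)
for z in reversed(zs[1:]):            # walk backwards in time from t = T
    z = z.detach().requires_grad_(True)
    dz = f(z, A)
    # vector-Jacobian products a^T (df/dz) and a^T (df/dA)
    dfdz, dfdA = torch.autograd.grad(dz, (z, A), grad_outputs=a)
    dLdA = dLdA + h * dfdA            # dL/dA = integral of a^T df/dA dt
    a = a + h * dfdz                  # da/dt = -a^T df/dz, stepped backwards

# Reference: direct backpropagation through the solver.
z_T = solve(z0, A, h, n_steps)[-1]
z_T.sum().backward()
print("max difference vs. backprop:", (dLdA - A.grad).abs().max().item())
```

The backward sweep above uses a simple Euler discretization of the adjoint ODE, so the printed difference shrinks as the step size decreases; the notebook's comparison against torchdiffeq plays the same role as the direct-backprop check here.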

ode_example.ipynb shows how to implement Euler and Runge-Kutta ODE solvers and apply them to solve an ODE.
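
For reference, a minimal sketch of Euler and classic fourth-order Runge-Kutta (RK4) steps applied to an assumed test problem dz/dt = -z (not the notebook's exact code) might look like:

```python
import math

def euler_step(f, t, z, h):           # forward Euler: first-order accurate
    return z + h * f(t, z)

def rk4_step(f, t, z, h):             # classic Runge-Kutta: fourth-order accurate
    k1 = f(t, z)
    k2 = f(t + h / 2, z + h / 2 * k1)
    k3 = f(t + h / 2, z + h / 2 * k2)
    k4 = f(t + h, z + h * k3)
    return z + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(step, f, z0, t0, t1, n_steps):
    h = (t1 - t0) / n_steps
    t, z = t0, z0
    for _ in range(n_steps):
        z, t = step(f, t, z, h), t + h
    return z

f = lambda t, z: -z                   # test problem dz/dt = -z, z(0) = 1
print("Euler:", integrate(euler_step, f, 1.0, 0.0, 1.0, 20))
print("RK4:  ", integrate(rk4_step, f, 1.0, 0.0, 1.0, 20))
print("exact:", math.exp(-1.0))       # exact solution z(1) = e^{-1}
```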
