A simple reverse-mode automatic differentiation library
This is my first attempt at understanding and implementing automatic differentiation. It is not optimized and should not be used in any production code. It is also not heavily tested yet, so there may be some bugs waiting around :)
Basic features:
- Higher-order differentiation of scalar functions (example in high_order.py)
- Matrix/vector function differentiation by simply defining the important matrix operations (matmul, transpose, etc.) as nodes that return the correct VJP (evaluated on the spot, not returned as a function; more on this in the TODO section). Example in simple_NN.py. A minimal sketch of the node/VJP idea follows this list.
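To make the node/VJP idea concrete, here is a minimal sketch of reverse-mode AD over scalars with eagerly evaluated local gradients. This is illustrative only, not this library's actual API; the `Node` class and `backward` function are hypothetical names:

```python
class Node:
    """Scalar value in the computation graph; records parents and local gradients."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent_node, local_gradient) pairs, evaluated eagerly
        self.grad = 0.0

    def __add__(self, other):
        # local gradients: d(a+b)/da = 1, d(a+b)/db = 1
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # local gradients: d(a*b)/da = b, d(a*b)/db = a
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

def backward(output):
    """Walk the graph in reverse topological order, accumulating gradients."""
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad

# f(x, y) = x*y + x  =>  df/dx = y + 1 = 5, df/dy = x = 3
x, y = Node(3.0), Node(4.0)
f = x * y + x
backward(f)
print(x.grad, y.grad)  # 5.0 3.0
```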
TODO:
- Look more into the unbroadcasting; I think I might be missing some edge cases (see the unbroadcast sketch below)
- Better vectorization?
- Make the grad() method of each node return a function for the VJP, instead of computing it on the spot. This is probably more efficient, since a VJP would then only run when its gradient is actually needed (see the lazy-VJP sketch below)
- Implement more examples (ResNet, a meaningful NN example, a simple GAN, etc.)
- Integrate with Neuro and demonstrate functionality
- This needs a lot more testing :)
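On the unbroadcasting TODO: the usual fix is a helper that sums an upstream gradient back down to the shape of the input that got broadcast. A hedged sketch, assuming standard NumPy broadcasting rules; `unbroadcast` is an illustrative name, not necessarily what this library uses:

```python
import numpy as np

def unbroadcast(grad, target_shape):
    """Sum `grad` down to `target_shape`, undoing NumPy broadcasting."""
    # Broadcasting may have prepended extra leading axes: sum them away.
    while grad.ndim > len(target_shape):
        grad = grad.sum(axis=0)
    # Axes that were size 1 in the original were stretched: sum them back, keeping dims.
    for axis, size in enumerate(target_shape):
        if size == 1:
            grad = grad.sum(axis=axis, keepdims=True)
    return grad

g = np.ones((4, 3))                  # upstream gradient after broadcasting
print(unbroadcast(g, (3,)).shape)    # (3,)
print(unbroadcast(g, (1, 3)).shape)  # (1, 3)
```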
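On the lazy-VJP TODO: the difference between the current eager style and the proposed one can be sketched with matmul. Both functions below are hypothetical illustrations of the two designs, not this library's code:

```python
import numpy as np

# Eager style (what the nodes currently do): the VJPs are computed as values
# the moment the upstream gradient arrives, even if some are never used.
def matmul_vjp_eager(upstream, a, b):
    # For c = a @ b: the gradient pulls back as upstream @ b.T and a.T @ upstream
    return upstream @ b.T, a.T @ upstream

# Lazy style (the TODO): grad() would return closures, so each VJP only runs
# if the backward pass actually needs that input's gradient.
def matmul_vjp_lazy(a, b):
    return (lambda upstream: upstream @ b.T,
            lambda upstream: a.T @ upstream)

a, b = np.ones((2, 3)), np.ones((3, 4))
g = np.ones((2, 4))
vjp_a, vjp_b = matmul_vjp_lazy(a, b)
assert np.allclose(vjp_a(g), matmul_vjp_eager(g, a, b)[0])
assert np.allclose(vjp_b(g), matmul_vjp_eager(g, a, b)[1])
```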
Inspirations: