Source-to-Source Debuggable Derivatives in Pure Python
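As a sketch of what "source-to-source" buys you: the derivative comes back as ordinary Python source with explicit intermediates, so you can read it and set breakpoints inside it. The code below is a hand-written illustration of that output shape, not the output of any particular tool.

```python
def f(x):
    y = x * x
    z = y + 3.0 * x
    return z

# The kind of code a source-to-source transform emits for df/dx:
# a forward pass, then a reverse pass over the same intermediates.
# Every line is plain Python, so an ordinary debugger works on it.
def df(x):
    # forward pass
    y = x * x
    z = y + 3.0 * x
    # reverse pass, seeded with dz = 1.0
    dz = 1.0
    dy = dz                  # z = y + 3x  =>  dz/dy = 1
    dx = 3.0 * dz            # direct contribution of x to z
    dx = dx + 2.0 * x * dy   # y = x*x    =>  dy/dx = 2x
    return dx

assert abs(df(2.0) - 7.0) < 1e-12  # f'(x) = 2x + 3, so f'(2) = 7
```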
Transparent calculations with uncertainties on the quantities involved (a.k.a. "error propagation"), including calculation of derivatives.
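This description matches the `uncertainties` package on PyPI; assuming that is the library meant, a minimal sketch of its API:

```python
from uncertainties import ufloat
from uncertainties.umath import sin  # math functions wrapped for propagation

x = ufloat(2.0, 0.1)     # nominal value 2.0, standard deviation 0.1
y = x ** 2 * sin(x)      # uncertainty propagates through the arithmetic

print(y.nominal_value, y.std_dev)
print(y.derivatives[x])  # dy/dx, the derivative behind the propagation
```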
End-to-end Generative Optimization for AI Agents
AutoBound automatically computes upper and lower bounds on functions.
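AutoBound's own API is not reproduced here; as a conceptual stand-in, the sketch below gets guaranteed upper and lower bounds the simplest possible way, with interval arithmetic. Everything in it is an illustration of what "bounds on functions" means, not AutoBound code.

```python
class Interval:
    """Closed interval [lo, hi] with just the arithmetic needed below."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

# Enclose f(x) = x*x + x over x in [0, 1] by evaluating f on intervals.
x = Interval(0.0, 1.0)
fx = x * x + x
print(fx.lo, fx.hi)  # a valid (possibly loose) lower/upper bound pair
```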
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
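Betty's API is not shown here; as a stand-in for what "multilevel optimization" involves, the sketch below differentiates an outer objective through an unrolled inner optimization loop in JAX. The toy problem and all names are illustrative.

```python
import jax

# Inner problem: minimize a regularized loss in w, given strength lam.
def inner_loss(w, lam):
    return (w - 3.0) ** 2 + lam * w ** 2

def inner_solve(lam, steps=100, lr=0.1):
    w = 0.0
    for _ in range(steps):  # unrolled gradient descent
        w = w - lr * jax.grad(inner_loss)(w, lam)
    return w

# Outer problem: tune lam so the inner solution hits a validation target.
def outer_loss(lam):
    return (inner_solve(lam) - 2.0) ** 2

print(jax.grad(outer_loss)(0.1))  # hypergradient through the inner loop
```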
Drop-in autodiff for NumPy.
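"Drop-in autodiff for NumPy" describes HIPS/autograd; assuming that library, the canonical usage is to import its NumPy wrapper in place of NumPy itself:

```python
import autograd.numpy as np  # NumPy-mirroring wrapper that traces operations
from autograd import grad

def tanh(x):
    return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

dtanh = grad(tanh)            # a function computing d tanh / dx
print(dtanh(1.0))             # ~0.419974
print(1 - np.tanh(1.0) ** 2)  # matches the closed-form derivative
```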
A JIT compiler for hybrid quantum programs in PennyLane
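This matches PennyLane's Catalyst project; assuming so, a minimal sketch of its `qjit` decorator compiling a hybrid quantum-classical function (device and parameters here are illustrative):

```python
import pennylane as qml
from catalyst import qjit

@qjit  # compile the whole hybrid program, not just the circuit
@qml.qnode(qml.device("lightning.qubit", wires=1))
def circuit(theta):
    qml.RX(theta, wires=0)
    return qml.expval(qml.PauliZ(0))

print(circuit(0.5))  # expectation value of Z after the rotation
```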
Minimal deep learning library written from scratch in Python, using NumPy/CuPy.
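Libraries like this usually hinge on a small reverse-mode core. A minimal scalar version of the idea, with illustrative names (real frameworks generalize this to arrays):

```python
class Value:
    """Scalar carrying a gradient; the core of a from-scratch autodiff."""
    def __init__(self, data, parents=()):
        self.data, self.grad = data, 0.0
        self._parents, self._backward = parents, lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward
        return out

    def backprop(self):
        # topological order, then apply the chain rule output-to-inputs
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a          # dc/da = b + 1 = 4, dc/db = a = 2
c.backprop()
print(a.grad, b.grad)  # 4.0 2.0
```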
Geometry processing utilities compatible with JAX for automatic differentiation.
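The pattern such utilities enable: write a geometric quantity as a `jax.numpy` expression and get per-vertex gradients for free. A small sketch (the triangle example is illustrative, not from the repo):

```python
import jax
import jax.numpy as jnp

def triangle_area(verts):
    # verts: (3, 2) array of 2-D vertex positions
    e1, e2 = verts[1] - verts[0], verts[2] - verts[0]
    return 0.5 * jnp.abs(e1[0] * e2[1] - e1[1] * e2[0])

verts = jnp.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(triangle_area(verts))            # 0.5
print(jax.grad(triangle_area)(verts))  # (3, 2) gradient per vertex coordinate
```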
Custom PyTorch-style machine learning framework with automatic differentiation implemented on NumPy; allows building GANs, VAEs, etc.
Differentiable optical models as parameterised neural networks in JAX using Zodiax.
JAX-DIPS is a differentiable interfacial PDE solver.
A new lightweight auto-differentiation library that builds directly on NumPy. Used as homework for CMU 11785/11685/11485.
A toy deep learning framework implemented from scratch in pure NumPy; a homemade PyTorch, so to speak.
Fuzzing Automatic Differentiation in Deep-Learning Libraries (ICSE'23)
Compressible Euler equations solved with a finite-volume method implemented in JAX, plugged into an optimization loop.
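As a reduced stand-in for that setup, the sketch below differentiates through a first-order upwind finite-volume loop for linear advection in JAX; the compressible Euler case follows the same pattern with a more involved flux. All constants and names here are illustrative.

```python
import jax
import jax.numpy as jnp

N = 64
x = jnp.linspace(0.0, 2.0 * jnp.pi, N, endpoint=False)
dx = x[1] - x[0]
dt = 0.5 * dx  # CFL number 0.5 for wave speed c = 1

def step(u):
    # first-order upwind finite-volume update for u_t + u_x = 0
    flux = u  # upwind flux for c = 1 > 0
    return u - dt / dx * (flux - jnp.roll(flux, 1))

def solve(u0, n_steps=100):
    u, _ = jax.lax.scan(lambda u, _: (step(u), None), u0, None, length=n_steps)
    return u

def loss(u0, target):
    return jnp.mean((solve(u0) - target) ** 2)

u0 = jnp.sin(x)
target = jnp.roll(u0, N // 8)   # a shifted profile as the fitting target
g = jax.grad(loss)(u0, target)  # gradient flows through all solver steps
print(g.shape)                  # (64,)
```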