Just a differentiation library
Create a matrix and a vector:

```julia
using Judi

@matrix A
@vector x
```
Create an expression:

```julia
expr = x' * A * x
```
The variable `expr` now contains an internal representation of the expression `x' * A * x`.
Compute the gradient and the Hessian with respect to the vector `x`:

```julia
g = gradient(expr, x)
H = hessian(expr, x)
```
Convert the gradient and the Hessian into standard notation using `to_std_string`:

```julia
to_std_string(g) # "Aᵀx + Ax"
to_std_string(H) # "Aᵀ + A"
```
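As a quick sanity check in plain Julia (independent of Judi; the concrete values of `A` and `x` below are illustrative assumptions), the derived gradient `Aᵀx + Ax` agrees with finite differences of `f(x) = xᵀAx`:

```julia
# Plain-Julia finite-difference check of the gradient formula Aᵀx + Ax.
# A `let` block keeps the symbolic A and x from the examples intact.
let A = [1.0 2.0; 3.0 4.0], x = [0.5, -1.0]
    f(v) = v' * A * v                     # scalar quadratic form xᵀAx
    g_analytic = A' * x + A * x           # gradient from "Aᵀx + Ax"
    h = 1e-6
    e(i) = (u = zeros(2); u[i] = 1.0; u)  # i-th standard basis vector
    g_fd = [(f(x + h * e(i)) - f(x - h * e(i))) / (2h) for i in 1:2]
    @assert isapprox(g_fd, g_analytic; atol=1e-4)
end
```

The Hessian `Aᵀ + A` is likewise the (constant) Jacobian of the gradient above.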
Jacobians can be computed with `jacobian`:

```julia
to_std_string(jacobian(A * x, x)) # "A"
```
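Again as a plain-Julia check with illustrative values, the finite-difference Jacobian of `A * x` recovers `A` exactly, since the map is linear:

```julia
# Plain-Julia check that the Jacobian of the linear map A*x is A.
let A = [1.0 2.0; 3.0 4.0], x = [0.5, -1.0], h = 1e-6
    e(i) = (u = zeros(2); u[i] = 1.0; u)
    # Assemble the Jacobian column by column from directional differences.
    J_fd = hcat([(A * (x + h * e(j)) - A * (x - h * e(j))) / (2h) for j in 1:2]...)
    @assert isapprox(J_fd, A; atol=1e-6)
end
```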
The method `derivative` can be used to compute arbitrary derivatives:

```julia
to_std_string(derivative(tr(A), A)) # "I"
```
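The result matches elementwise differentiation: perturbing `A[i, j]` changes `tr(A)` only when `i == j`. A plain-Julia check with illustrative values:

```julia
using LinearAlgebra  # provides tr and I

# ∂tr(A)/∂A[i,j] is 1 when i == j and 0 otherwise, i.e. the identity.
let A = [1.0 2.0; 3.0 4.0], h = 1e-6
    D = [begin
             E = zeros(2, 2); E[i, j] = h
             (tr(A + E) - tr(A - E)) / (2h)
         end for i in 1:2, j in 1:2]
    @assert isapprox(D, Matrix(I, 2, 2); atol=1e-8)
end
```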
The method `to_std_string` will throw an exception when given an expression that cannot be converted to standard notation.
Currently supported operations: `tr`, `sin`, `cos`, `+`, `-`, `*`, `'`, `.*`.
This library is not yet published in the general registry. To install it directly from GitHub:

```julia
using Pkg; Pkg.add(url="https://github.com/asterycs/Judi.jl.git")
```
The implementation is based on the ideas presented in
S. Laue, M. Mitterreiter, and J. Giesen. Computing Higher Order Derivatives of Matrix and Tensor Expressions, NeurIPS 2018.