Training via DiffEqFlux fails when the `save_idxs` keyword is used inside the loss function. Differentiation appears to be the problem: evaluating the loss function itself works fine.
The MWE is taken from the docs.
Discussion on the Julia Discourse is here.
using DifferentialEquations, Flux, Optim, DiffEqFlux

function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = dx = α*x - β*x*y
    du[2] = dy = -δ*y + γ*x*y
end

# Initial condition
u0 = [1.0, 1.0]

# Simulation interval and intermediary points
tspan = (0.0, 10.0)
tsteps = 0.0:0.1:10.0

# LV equation parameters. p = [α, β, δ, γ]
p = [1.5, 1.0, 3.0, 1.0]

# Set up the ODE problem
prob = ODEProblem(lotka_volterra!, u0, tspan, p)

# Loss against the second state only, saved via save_idxs
function loss(p)
    sol = solve(prob, Tsit5(), p=p, save_idxs=[2], saveat=tsteps)
    loss = sum(abs2, sol .- 1)
    return loss, sol
end

result_ode = DiffEqFlux.sciml_train(loss, p, ADAM(0.1), maxiters=100)
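A possible workaround (a sketch, not verified against this issue) is to avoid `save_idxs` in the differentiated path: save the full state and index the solution afterwards, which keeps the loss mathematically identical while sidestepping the `save_idxs` adjoint code path.

sol_full = solve(prob, Tsit5(), p=p, saveat=tsteps)

# Second component across all saved time points
function loss_workaround(p)
    sol = solve(prob, Tsit5(), p=p, saveat=tsteps)
    l = sum(abs2, sol[2, :] .- 1)
    return l, sol
end

This saves both states (slightly more memory) but should train identically if the indexing is differentiable.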