Missing Cartesian indexing in NoTimeSolution #56
Merged
Conversation
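The pull request itself carries no description, so for context: the title suggests that `NoTimeSolution` objects were missing a `getindex` method for `CartesianIndex`, presumably the single added line in the coverage diff below. The following is a minimal, hypothetical sketch of the idea using a stand-in wrapper type; `ToyNoTimeSolution` is not the SciMLBase type, and this is not the actual diff of the PR:

```julia
# Hypothetical stand-in for a NoTimeSolution-like wrapper around a state array `u`.
struct ToyNoTimeSolution{A<:AbstractArray}
    u::A
end

# Linear indexing forwarded to the wrapped array
Base.getindex(sol::ToyNoTimeSolution, i::Int) = sol.u[i]

# The kind of method this PR adds: Cartesian indexing, so that
# sol[CartesianIndex(i, j)] no longer throws a MethodError.
Base.getindex(sol::ToyNoTimeSolution, I::CartesianIndex) = sol.u[I]

sol = ToyNoTimeSolution(rand(3, 3))
@assert sol[CartesianIndex(2, 3)] == sol.u[2, 3]
```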
Codecov Report
```
@@            Coverage Diff             @@
##           master      #56      +/-   ##
==========================================
- Coverage   11.18%   11.18%   -0.01%
==========================================
  Files          39       39
  Lines        2851     2852       +1
==========================================
  Hits          319      319
- Misses       2532     2533       +1
```
Continue to review full report at Codecov.
YingboMa approved these changes on Apr 17, 2021.
ChrisRackauckas added a commit that referenced this pull request on Jan 18, 2023:
This should be a nice improvement overall to the health of the debugging experience. For example, the code from this post (https://discourse.julialang.org/t/optimizationmoi-ipopt-violating-inequality-constraint/92608) led to a question that took a bit to understand. But now, when you run:

```julia
import Optimization
import OptimizationMOI, Ipopt

const AV{T} = AbstractVector{T}

function model_constraints!(out::AV{<:Real}, u::AV{<:Real}, data)
    # Model parameters
    dt, a, b = u
    out[1] = a - 1/dt # Must be < 0
    @info "Must be NEGATIVE: $(out[1])"
end

function model_variance(u::AV{T}, data::AV{<:Real}) where T<:Real
    # Model parameters
    dt, a, b = u
    # Compute variance
    variance = zeros(T, length(data))
    variance[1] = one(T)
    for t in 1:(length(data) - 1)
        variance[t+1] = (1 - dt * a) * variance[t] + dt * data[t]^2 + dt * b
    end
    variance
end

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        loglik += -(log(2π) + log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end

function model_fit(u0::AV{T}, data::AV{<:Real}) where T<:Real
    func = Optimization.OptimizationFunction(
        model_loss, Optimization.AutoForwardDiff(),
        cons=model_constraints!
    )
    prob = Optimization.OptimizationProblem(
        func, u0, data,
        # 0 < dt < 1 && 1 < a < Inf && 0 < b < Inf
        lb=T[0.0, 1.0, 0.0], ub=T[1.0, Inf, Inf],
        #   ^dt  ^a   ^b        ^dt  ^a   ^b     <= model parameters
        lcons=T[-Inf], ucons=T[0.0] # a - 1/dt < 0
    )
    sol = Optimization.solve(prob, Ipopt.Optimizer())
    sol.u
end

let
    data = [
        2.1217711584057386, -0.28350145551002465, 2.3593492969513004, 0.192856733601849, 0.4566485836385113,
        1.332717934013979, -1.286716619379847, 0.9868669960185211, 2.2358674776395224, -2.7933975791568098,
        1.2555871497124622, 1.276879759908467, -0.8392016987911409, -1.1580875182201849, 0.33201646080578456,
        -0.17212553408696898, 1.1275285626369556, 0.23041139849229036, 1.648423577528424, 2.384823597473343,
        -0.4005518932539747, -1.117737311211693, -0.9490152960583265, -1.1454539355078672, 1.4158585811404159,
        -0.18926972177257692, -0.2867541528181491, -1.2077459688543788, -0.6397173049620141, 0.66147783407023,
        0.049805188778543466, 0.902540117368457, -0.7018417933284938, 0.47342354473843684, 1.2620345361591596,
        -1.1483844812087018, -0.06487285080802752, 0.39020117013487715, -0.38454491504165356, 1.5125786171885645,
        -0.6751768274451174, 0.490916740658628, 0.012872300530924086, 0.46532447715746716, 0.34734421531357157,
        0.3830452463549559, -0.8730874028738718, 0.4333151627834603, -0.40396180775692375, 2.0794821773418497,
        -0.5392735774960918, 0.6519326323752113, -1.4844713145398716, 0.3688828625691108, 1.010912990717231,
        0.5018274939956874, 0.36656889279915833, -0.11403975693239479, -0.6460314660359935, -0.41997005020823147,
        0.9652752515820495, -0.37375868692702047, -0.5780729659197872, 2.642742798278919, 0.5076984117208074,
        -0.4906395089461916, -1.804352047187329, -0.8596663844837792, -0.7510485548262176, -0.07922589350581195,
        1.7201304839487317, 0.9024493222130577, -1.8216089665357902, 1.3929269238775426, -0.08410752079538407,
        0.6423068180438288, 0.6615201016351212, 0.18546977816594887, -0.717521690742993, -1.0224309324751113,
        1.7748350222721971, 0.1929546575877559, -0.1581871639724676, 0.20198379311238596, -0.6919373947349301,
        -0.9253274269423383, 0.549366272989534, -1.9302106783541606, 0.7197247279281573, -1.220334158468621,
        -0.9187468058921053, -2.1452607604834184, -2.1558650694862687, -0.9387913392336701, -0.676637835687265,
        -0.16621998352492198, 0.5637177022958897, -0.5258315560278541, 0.8413359958184765, -0.9096866525337141
    ]

    # u0 = [0 < dt < 1, 1 < a < 1/dt, 0 < b < Inf]
    u0 = [0.3, 2.3333333333333335, 0.33333333333333337]
    @assert 0 < u0[1] < 1
    @assert 1 < u0[2] < 1 / u0[1]
    @assert 0 < u0[3] < Inf

    @info "Optimizing..." u0
    model_fit(u0, data)
end
```

you get:

```julia
DomainError detected in the user `f` function. This occurs when the domain of a function is violated.
For example, `log(-1.0)` is undefined because `log` of a real number is defined to only output real
numbers, but `log` of a negative number is complex valued and therefore Julia throws a DomainError
by default. Cases to be aware of include:

* `log(x)`, `sqrt(x)`, `cbrt(x)`, etc. where `x<0`
* `x^y` for `x<0` floating point `y` (example: `(-1.0)^(1/2) == im`)

Within the context of SciML, this error can occur within the solver process even if the domain
constraint would not be violated in the solution due to adaptivity. For example, an ODE solver or
optimization routine may check a step at `new_u` which violates the domain constraint, and if
violated reject the step and use a smaller `dt`. However, the throwing of this error will have
halted the solving process.

Thus the recommended fix is to replace this function with the equivalent ones from NaNMath.jl
(https://github.com/JuliaMath/NaNMath.jl) which returns a NaN instead of an error. The solver will
then effectively use the NaN within the error control routines to reject the out of bounds step.
Additionally, one could perform a domain transformation on the variables so that such an issue
does not occur in the definition of `f`.

For more information, check out the following FAQ page:
https://docs.sciml.ai/Optimization/stable/API/FAQ/#The-Solver-Seems-to-Violate-Constraints-During-the-Optimization,-Causing-DomainErrors,-What-Can-I-Do-About-That?

Note that detailed debugging information adds a small amount of overhead to SciML solves which can
be disabled with the keyword argument `debug = NoDebug()`.

The detailed original error message information from Julia reproduced below:

ERROR: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:66
  [2] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
  [3] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
  [4] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
  [5] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88

caused by: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace: [1] throw_complex_domainerror(f::Symbol, x::Float64) @ Base.Math .\math.jl:33 [2] _log(x::Float64, base::Val{:ℯ}, func::Symbol) @ Base.Math .\special\log.jl:301 [3] log @ .\special\log.jl:267 [inlined] [4] model_loss(u::Vector{Float64}, data::Vector{Float64}) @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:60 [5] OptimizationFunction @ C:\Users\accou\.julia\dev\SciMLBase\src\scimlfunctions.jl:3580 [inlined] [6] eval_objective(moiproblem::OptimizationMOI.MOIOptimizationProblem{Float64, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Optimization.var"#57#74"{ForwardDiff.GradientConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#60#77"{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#63#80", Optimization.var"#64#81"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Optimization.var"#66#83"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3}}}}, Optimization.var"#71#88"{Int64, Vector{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}}}}, Vector{Optimization.var"#69#86"{Int64}}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}, Matrix{Float64}, Matrix{Float64}}, x::Vector{Float64}) @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:82 [7] eval_objective(model::Ipopt.Optimizer, x::Vector{Float64}) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:514 [8] (::Ipopt.var"#eval_f_cb#1"{Ipopt.Optimizer})(x::Vector{Float64}) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:597 [9] _Eval_F_CB(n::Int32, x_ptr::Ptr{Float64}, x_new::Int32, obj_value::Ptr{Float64}, user_data::Ptr{Nothing}) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:38 [10] IpoptSolve(prob::Ipopt.IpoptProblem) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:442 [11] optimize!(model::Ipopt.Optimizer) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:727 [12] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer; maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}) @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:381 [13] 
__solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer)
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:327
 [14] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:59
 [15] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
 [16] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
 [17] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
 [18] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88
```
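Following the recommendation in the error message above, the quickest way to make the Discourse example run is to swap the bare `log(var)` in `model_loss` for `NaNMath.log`, so that a trial point which drives the variance negative yields a `NaN` objective that the solver's error control can reject instead of an exception that halts the solve. A hedged sketch, not part of this PR: it assumes the `NaNMath` package is installed, reuses `model_variance` from the example above, and the name `model_loss_nanmath` is just for illustration.

```julia
import NaNMath

# Same loss as `model_loss` above, but with the variance log made NaN-safe.
function model_loss_nanmath(u::AbstractVector{T}, data::AbstractVector{<:Real})::T where T<:Real
    variance = model_variance(u, data)  # from the example above
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        # NaNMath.log returns NaN for negative arguments instead of throwing a
        # DomainError, so an out-of-bounds step can be rejected and the solve continues.
        loglik += -(log(2π) + NaNMath.log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end
```

As the message notes, the other options are a domain transformation of the parameters so the variance stays positive inside `f`, or, once the issue is understood, disabling the verbose report with the `debug = NoDebug()` keyword argument.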