Parametrize JuMP model in optimizer type #1348
Conversation
I will need convincing benchmarks before approving this. The usability loss from extra complexity in the printouts and debugging statements is pretty terrible.
test/print.jl (Outdated)
@@ -116,10 +116,10 @@ end
 v = [x,y,x]
 A = [x y; y x]
-io_test(REPLMode, v, "JuMP.VariableRef[x, y, x]")
+io_test(REPLMode, v, "JuMP.VariableRef{JuMP.Model{MathOptInterface.Utilities.CachingOptimizer{Union{MathOptInterface.AbstractOptimizer, Void},MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}}}}[x, y, x]")
JuMP.VariableRef{JuMP.Model{MathOptInterface.Utilities.CachingOptimizer{Union{MathOptInterface.AbstractOptimizer, Void},MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}}}}
is not ok for user-facing printouts.
This is fixed
Codecov Report
@@            Coverage Diff             @@
##           master    #1348      +/-   ##
==========================================
+ Coverage   89.33%   89.34%    +<.01%
==========================================
  Files          24       24
  Lines        3386     3368       -18
==========================================
- Hits         3025     3009       -16
+ Misses        361      359        -2
Continue to review full report at Codecov.
src/print.jl (Outdated)
@@ -130,7 +130,17 @@ function var_str(::Type{IJuliaMode}, v::AbstractVariableRef; mathmode=true)
        return math("noname", mathmode)
    end
end

# We want arrays of variables to be printed with `JuMP.VariableRef` as eltype
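For readers following along, the kind of printing override being discussed would look roughly like this minimal, self-contained sketch (made-up `FakeModel`/`FakeVariableRef` names, not the reverted JuMP code): a `show` method on the type object shortens how the element type appears when arrays are printed.

```julia
struct FakeModel{B}
    backend::B
end

struct FakeVariableRef{M}
    model::M
    index::Int
end

# Print the type object itself (and hence the eltype prefix of arrays) without
# its parameters. This relies on internal printing behavior, so it can break
# across Julia versions.
Base.show(io::IO, ::Type{<:FakeVariableRef}) = print(io, "FakeVariableRef")

m = FakeModel(Dict{Int,Float64}())
v = [FakeVariableRef(m, 1), FakeVariableRef(m, 2)]
print(v)  # the eltype prefix now reads "FakeVariableRef" rather than the full parametric name
```

The trade-off raised in the review applies here as well: once the parameters are hidden, error messages and debugging output lose information, and the hook depends on printing internals that change between Julia versions.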
Eh, this is another chunk of code that will break between Julia versions. Weird printing will also confuse developers when they try to create arrays to store VariableRefs.
Yes, but it improves printing :-P Ok I'll revert this ^^
This reverts commit 16c4465.
I have added a benchmark; see the initial comment for its results.
The overhead of having an untyped backend is now 1/3 for both time and space:

julia> using JuMP

julia> const model = Model()
A JuMP Model

julia> using BenchmarkTools

julia> @btime @variable(model)
  33.454 ns (2 allocations: 48 bytes)
noname

By annotating the type of the backend in https://github.com/JuliaOpt/JuMP.jl/blob/a0af39df82fce30cce0821f4f08ec7b13d710b01/src/variables.jl#L206, this drops to:

julia> using JuMP

julia> const model = Model()
A JuMP Model

julia> using BenchmarkTools

julia> @btime @variable(model)
  18.558 ns (1 allocation: 32 bytes)
noname
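To illustrate the effect being measured, here is a self-contained sketch with made-up names (`FakeModel`, `FakeBackend`), not JuMP's internals: a call through an abstractly typed field is dispatched dynamically, while annotating the concrete type at the use site, as the linked line in src/variables.jl does for the backend, lets the compiler resolve the call statically.

```julia
using BenchmarkTools

abstract type AbstractBackend end

mutable struct FakeBackend <: AbstractBackend
    nvars::Int
end

struct FakeModel
    backend::AbstractBackend  # abstract field type: the concrete type is lost here
end

add_variable!(b::FakeBackend) = (b.nvars += 1)

untyped_add!(m::FakeModel) = add_variable!(m.backend)              # dynamic dispatch
typed_add!(m::FakeModel) = add_variable!(m.backend::FakeBackend)   # annotation restores static dispatch

const fake = FakeModel(FakeBackend(0))
@btime untyped_add!($fake)  # may be slower and allocate, depending on the Julia version
@btime typed_add!($fake)    # resolved at compile time
```

The annotation trades flexibility for speed: it throws if a different backend type is ever stored in the field.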
I was looking into this a bit last night. I couldn't figure out why exactly Julia is allocating (I couldn't make a small example of an allocation when dispatching on an object whose type is unknown). Maybe there's a trick to fix it. Another option without parameterizing the model is to do the work in batch and call
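One way to read the batch suggestion, sketched with made-up names (not a proposed JuMP API): pay the dynamic dispatch once per batch instead of once per variable, by moving the loop behind a function barrier.

```julia
abstract type AbstractBackend end

mutable struct FakeBackend <: AbstractBackend
    nvars::Int
end

struct FakeModel
    backend::AbstractBackend  # abstract field type, as above
end

add_variable!(b::FakeBackend) = (b.nvars += 1)

# One dynamic dispatch per variable: the backend's concrete type is
# rediscovered at every call.
add_one_by_one!(m::FakeModel, n) = foreach(_ -> add_variable!(m.backend), 1:n)

# One dynamic dispatch in total: `_add_batch!` is compiled for the concrete
# backend type, so the loop body dispatches statically.
add_in_batch!(m::FakeModel, n) = _add_batch!(m.backend, n)
_add_batch!(backend, n) = foreach(_ -> add_variable!(backend), 1:n)
```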
The batch idea could help. The 32 remaining bytes are due to the fact that VariableRef is not isbits.
What is the status of this?
Parametrizing the model on the optimizer type allows zero overhead on the JuMP side.
The variable type therefore now needs to be parametrized on the model type so that its fields are concretely typed, which accounts for many of the changes in this PR.
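As a rough illustration of the shape of this change (simplified, made-up names `ParamModel` and `ParamVariableRef`, not the PR's actual definitions): the model carries its backend type as a parameter and the variable reference carries the model type, so every field ends up concretely typed.

```julia
# Simplified sketch of the parametrization: no abstract field types remain.
struct ParamModel{B}
    backend::B
end

struct ParamVariableRef{M}
    model::M
    index::Int
end

backend = Float64[]              # stand-in for an optimizer / caching backend
model = ParamModel(backend)      # ParamModel{Vector{Float64}}
x = ParamVariableRef(model, 1)   # ParamVariableRef{ParamModel{Vector{Float64}}}
isconcretetype(fieldtype(typeof(x), :model))  # true: the model field is concretely typed
```

The flip side is what the review thread discusses above: the full parametric type then shows up wherever the variable type is printed.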
Benchmarking results
This uses the benchmarking file in test/perf/backend_overhead.jl.

Before this change:

After this change:
Related to jump-dev/MathOptInterface.jl#321