The output here should be a BitMatrix, or, if you care about type stability (which is quite reasonable), a Matrix{Float64} with entries 0.0 and 1.0. Some stray -0.0s aren't "wrong" in the sense that -0.0 == 0.0, but unless they convey some meaningful information, the sign bit should be normalized to 0.0:
julia> using JuMP
julia> import HiGHS
julia> functionbin_pack(capacities::AbstractVector{<:Real}, sizes::AbstractVector{<:Real})
model = Model(HiGHS.Optimizer)
@variable(model, x[eachindex(sizes), eachindex(capacities)], Bin)
@objective(model, Max, sum(x))
@constraint(model, sum(x, dims=2) .<= 1)
@constraint(model, sizes' * x .<= capacities')
set_silent(model)
optimize!(model)
value.(x)
end
bin_pack (generic function with 1 method)
julia> bin_pack([4,5], [3,2,2,2])
4×2 Matrix{Float64}:
0.0 1.0
1.0 -0.0
1.0 -0.0
-0.0 1.0
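A minimal sketch of the normalization the issue asks for (the matrix below is copied from the REPL session above; the post-processing itself is illustrative, not part of the reported function): under IEEE-754 arithmetic, adding `0.0` maps `-0.0` to `+0.0`, and thresholding at `0.5` yields a `BitMatrix` directly.

```julia
# Solver output from the session above, including the stray -0.0 entries.
x_val = [0.0 1.0; 1.0 -0.0; 1.0 -0.0; -0.0 1.0]

x_norm = x_val .+ 0.0   # IEEE-754: -0.0 + 0.0 == +0.0, clearing the sign bit
x_bits = x_val .> 0.5   # BitMatrix alternative: both -0.0 and 0.0 map to false

# signbit distinguishes -0.0 from +0.0 even though -0.0 == 0.0
@assert !any(signbit, x_norm)
@assert x_bits == Bool[0 1; 1 0; 1 0; 0 1]
```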
This is expected behavior that we won't be fixing.
Solvers, even when they have "binary" variables, in fact solve in double precision with a tolerance (the HiGHS default is 1e-6), so anything in [N - tol, N + tol] is considered "integer" when N is an integer.
Note that, in most cases you can round the solution to recover integrality, but in some cases, the rounded solution may violate your constraints by more than the primal feasibility tolerance.