Incorrect dual variable extracted from equality constraint between Hermitian matrices #3796
Moved this to MOI because I assume it's an issue in the bridge |
Oh wait. Hypatia supports this directly, so there is no bridge. |
Using

```julia
using JuMP
using LinearAlgebra
import Hypatia
import Clarabel
import Ket
import Random

function dual_test(d)
    Random.seed!(1)
    ρ = Ket.random_state(d^2, 1)
    model = Model(Hypatia.Optimizer; add_bridges = false)
    @variable(model, σ[1:d^2, 1:d^2] in HermitianPSDCone())
    @variable(model, λ)
    noisy_state = Hermitian(ρ + λ * I(d^2))
    @constraint(model, witness_constraint, σ == noisy_state)
    @objective(model, Min, λ)
    @constraint(model, Ket.partial_transpose(σ, 2, [d, d]) in HermitianPSDCone())
    optimize!(model)
    W = dual(witness_constraint)
    display(W)
    display(MOI.get(backend(model), MOI.ConstraintDual(), index(witness_constraint)))
    FIX = 0.5 * Hermitian(W + Diagonal(W))
    return objective_value(model), dot(ρ, FIX)
end

dual_test(2)
```

I get
so the triangular dual that I get from Hypatia has the same constants as those that JuMP reports, just reshaped. |
I believe the issue is that what Hypatia sees is just a vector constraint; as an inner product between two vectors, this dual variable is correct. JuMP, on the other hand, interprets it as an inner product between two matrices, which consists of not only the upper triangle that goes to Hypatia but also the redundant lower triangle. This redundant part makes the off-diagonal terms appear twice. Note that if I use |
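The double-counting can be seen directly from the inner products. A minimal sketch (the matrices `X` and `Y` below are illustrative values, not from the issue): for Hermitian matrices, each off-diagonal pair contributes twice to the matrix inner product, once from the upper and once from the lower triangle, so a representation that keeps only the upper triangle must scale the off-diagonal term by 2 to reproduce it:

```julia
using LinearAlgebra

# Two arbitrary 2x2 Hermitian matrices (illustrative values only).
X = Hermitian([1.0 2.0+3.0im; 2.0-3.0im 4.0])
Y = Hermitian([5.0 1.0-1.0im; 1.0+1.0im 2.0])

# Full matrix inner product <X, Y> = Re tr(X' * Y), summing over all entries.
matrix_ip = real(tr(X' * Y))

# Triangular version: diagonal entries once, the off-diagonal entry scaled
# by 2 to account for its mirror image in the lower triangle.
triangle_ip = real(X[1, 1] * Y[1, 1] + X[2, 2] * Y[2, 2] + 2 * conj(X[1, 2]) * Y[1, 2])

@assert matrix_ip ≈ triangle_ip
```

So a dual vector that is correct against the triangular representation picks up a factor of 2 on the off-diagonals when it is read back as a full matrix without rescaling.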
Hypatia needs the |
The issue is not restricted to Hypatia. I've also tried with Clarabel, SCS, and Mosek, and all have exactly the same problem, for both the real and complex PSD cones. I've also tried editing Hypatia's |
Yes, I think this is correct. |
Here's a better example of what is happening:

```julia
julia> using JuMP

julia> using Clarabel

julia> using Hypatia

julia> using SCS

julia> using LinearAlgebra

julia> function hermitian_equality(optimizer)
           model = Model(optimizer)
           set_silent(model)
           @variable(model, x[1:2, 1:2])
           @objective(model, Min, sum(x))
           y = [x[1,1] (x[1,2]+x[2,1]im); (x[1,2]-x[2,1]im) x[2,2]]
           z = [1 2+3im; 2-3im 4]
           @constraint(model, c, Hermitian(y - z) == 0)
           optimize!(model)
           @assert is_solved_and_feasible(model)
           return dual(c)
       end
hermitian_equality (generic function with 1 method)

julia> function hermitian_broadcast(optimizer)
           model = Model(optimizer)
           set_silent(model)
           @variable(model, x[1:2, 1:2])
           @objective(model, Min, sum(x))
           y = [x[1,1] (x[1,2]+x[2,1]im); (x[1,2]-x[2,1]im) x[2,2]]
           z = [1 2+3im; 2-3im 4]
           @constraint(model, c, y .== z)
           optimize!(model)
           @assert is_solved_and_feasible(model)
           return dual.(c)
       end
hermitian_broadcast (generic function with 1 method)

julia> hermitian_equality(Clarabel.Optimizer)
2×2 Hermitian{ComplexF64, Matrix{ComplexF64}}:
 1.0+0.0im  1.0+1.0im
 1.0-1.0im  1.0+0.0im

julia> hermitian_broadcast(Clarabel.Optimizer)
2×2 Matrix{ComplexF64}:
 1.0+0.0im  0.5+0.5im
 0.5-0.5im  1.0+0.0im

julia> hermitian_equality(Hypatia.Optimizer)
2×2 Hermitian{ComplexF64, Matrix{ComplexF64}}:
 1.0+0.0im  1.0+1.0im
 1.0-1.0im  1.0+0.0im

julia> hermitian_broadcast(Hypatia.Optimizer)
2×2 Matrix{ComplexF64}:
 1.0+0.0im  0.0+0.0im
 1.0-1.0im  1.0+0.0im

julia> hermitian_equality(SCS.Optimizer)
2×2 Hermitian{ComplexF64, Matrix{ComplexF64}}:
 1.0+0.0im  1.0+1.0im
 1.0-1.0im  1.0+0.0im

julia> hermitian_broadcast(SCS.Optimizer)
2×2 Matrix{ComplexF64}:
 1.0+0.0im  0.5+0.5im
 0.5-0.5im  1.0+0.0im
```
|
This also happens for `Symmetric`:

```julia
julia> function hermitian_equality(optimizer)
           model = Model(optimizer)
           set_silent(model)
           @variable(model, x[1:2, 1:2])
           @objective(model, Min, sum(x))
           y = [x[1,1] (x[1,2]+x[2,1]im); (x[1,2]+x[2,1]im) x[2,2]]
           z = [1 2+3im; 2+3im 4]
           @constraint(model, c, Symmetric(y - z) == 0)
           optimize!(model)
           @assert is_solved_and_feasible(model)
           return dual(c)
       end
hermitian_equality (generic function with 1 method)

julia> function hermitian_broadcast(optimizer)
           model = Model(optimizer)
           set_silent(model)
           @variable(model, x[1:2, 1:2])
           @objective(model, Min, sum(x))
           y = [x[1,1] (x[1,2]+x[2,1]im); (x[1,2]+x[2,1]im) x[2,2]]
           z = [1 2+3im; 2+3im 4]
           @constraint(model, c, y .== z)
           optimize!(model)
           @assert is_solved_and_feasible(model)
           return dual.(c)
       end
hermitian_broadcast (generic function with 1 method)

julia> hermitian_equality(Clarabel.Optimizer)
2×2 Symmetric{ComplexF64, Matrix{ComplexF64}}:
 1.0+0.0im  1.0+1.0im
 1.0+1.0im  1.0+0.0im

julia> hermitian_broadcast(Clarabel.Optimizer)
2×2 Matrix{ComplexF64}:
 1.0+0.0im  0.5+0.5im
 0.5+0.5im  1.0+0.0im

julia> hermitian_equality(Hypatia.Optimizer)
2×2 Symmetric{ComplexF64, Matrix{ComplexF64}}:
 1.0+0.0im  1.0+1.0im
 1.0+1.0im  1.0+0.0im

julia> hermitian_broadcast(Hypatia.Optimizer)
2×2 Matrix{ComplexF64}:
 1.0+0.0im  0.0+0.0im
 1.0+1.0im  1.0+0.0im

julia> hermitian_equality(SCS.Optimizer)
2×2 Symmetric{ComplexF64, Matrix{ComplexF64}}:
 1.0+0.0im  1.0+1.0im
 1.0+1.0im  1.0+0.0im

julia> hermitian_broadcast(SCS.Optimizer)
2×2 Matrix{ComplexF64}:
 1.0+0.0im  0.5+0.5im
 0.5+0.5im  1.0+0.0im
```

So it is just the case that when we add these symmetric equality constraints we're doubling up on the duals. |
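Consistent with that, halving the off-diagonal entries of the equality-constraint dual recovers the element-wise duals of the broadcast formulation. A minimal sketch of the correction (the same transformation as the `FIX` line in the MWE; the numbers are the Clarabel values reported above):

```julia
using LinearAlgebra

# Dual as reported by dual(c) for the Hermitian equality constraint
# (Clarabel output from the example above).
W = Hermitian([1.0+0.0im 1.0+1.0im; 1.0-1.0im 1.0+0.0im])

# W + Diagonal(W) doubles the diagonal, so multiplying by 0.5 keeps the
# diagonal and halves only the off-diagonal entries.
W_fixed = 0.5 * (W + Diagonal(W))

@assert W_fixed ≈ [1.0 0.5+0.5im; 0.5-0.5im 1.0]
```

The result matches the broadcast duals reported by `hermitian_broadcast(Clarabel.Optimizer)`.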
When I use a constraint of the form

```julia
@constraint(model, A == B)
```

where `A` and `B` are Hermitian matrices, for some reason the dual variable associated with this constraint has its off-diagonal elements multiplied by 2. I've tested a couple of different solvers, so the bug must be inside JuMP itself. A MWE follows below. Note the next-to-last line, where I divide the off-diagonal elements of the dual variable by 2 to make it match the primal result.