
Add OptimizationSenseAtom #636

Closed
wants to merge 2 commits into from

Conversation

odow
Member

@odow odow commented May 8, 2024

Closes #310


codecov bot commented May 8, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.87%. Comparing base (8a91e05) to head (50cce80).

@@           Coverage Diff           @@
##           master     #636   +/-   ##
=======================================
  Coverage   97.86%   97.87%           
=======================================
  Files          88       89    +1     
  Lines        5114     5128   +14     
=======================================
+ Hits         5005     5019   +14     
  Misses        109      109           


@odow
Member Author

odow commented May 8, 2024

I don't know what's up with nightly

@blegat
Member

blegat commented May 8, 2024

Looks good. Maybe add a docstring explaining what this is used for? IIUC, this is to get an error if we don't use it the right way, e.g.,

x = Variable()
t = Variable()
add_constraint!(t, t >= x)
add_constraint!(t, t >= -x)
maximize(t)

Here, because we maximize t, the formulation no longer yields the absolute value.
Now if you do

maximize(OptimizationSenseAtom(t, MOI.MIN_SENSE))

then you get an error saying it's not DCP because t is convex.
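To make the pitfall concrete, here is a solver-free sketch (plain Julia with an illustrative numeric value, not Convex.jl code) of why the epigraph model for `|x|` is only valid under `MIN_SENSE`:

```julia
# For a fixed x, the constraints t >= x and t >= -x define the feasible
# interval [|x|, Inf). Minimizing t over that interval recovers |x|;
# maximizing t is unbounded, so the epigraph reformulation says nothing
# about |x| under MAX_SENSE.
x = 3.0
t_lower = max(x, -x)       # infimum of the feasible set for t
@assert t_lower == abs(x)  # minimization recovers the absolute value
# The supremum of the feasible set is +Inf, so maximizing t is meaningless.
```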

@ericphanson
Collaborator

ericphanson commented May 8, 2024

Looking at this more, I think we are close to this working with Problems, which gives us the nicer syntax of minimize/maximize, and potentially more natural places to put constraints:

using Convex, LinearAlgebra, Clarabel
using Convex: AbstractExpr

# monkeypatch the existing `getproperty`
function Base.getproperty(p::Problem, s::Symbol)
    if s === :optval
        if getfield(p, :status) == Convex.MOI.OPTIMIZE_NOT_CALLED
            return nothing
        else
            return Convex.objective_value(p)
        end
    elseif s === :size
        return p.objective.size
    end
    return getfield(p, s)
end

function lamb_min(A::AbstractExpr)
    t = Variable()
    n = size(A, 1)
    n == size(A, 2) || throw(ArgumentError("A must be square"))
    p = maximize(t, A - t * Matrix(1.0I, n, n) ⪰ 0)
    return p
end

A = Variable(2, 2)
p = maximize(lamb_min(A) + 1, [A >= 0, A[1, 1] == 2.0])

solve!(p, Clarabel.Optimizer)

julia> print(p.model)
Maximize ScalarAffineFunction{Float64}:
 1.0 + 1.0 v[5]

Subject to:

VectorAffineFunction{Float64}-in-Zeros
 ┌               ┐
 │-2.0 + 1.0 v[1]│
 └               ┘  Zeros(1)

VectorAffineFunction{Float64}-in-Nonnegatives
 ┌              ┐
 │0.0 + 1.0 v[1]│
 │0.0 + 1.0 v[2]│
 │0.0 + 1.0 v[3]│
 │0.0 + 1.0 v[4]│
 └              ┘  Nonnegatives(4)

VectorAffineFunction{Float64}-in-PositiveSemidefiniteConeSquare
 ┌                                    ┐
 │0.0 + 1.0 v[1] - 1.0 v[5]           │
 │0.0 + 1.0 v[3] + 1.0 v[2] - 1.0 v[3]│
 │0.0 + 1.0 v[3]                      │
 │0.0 + 1.0 v[4] - 1.0 v[5]           │
 └                                    ┘  PositiveSemidefiniteConeSquare(2)

However, it's not fully correct yet: if I flip maximize to minimize in the definition of lamb_min, I don't get an error.

IMO if we get this working, it will be easier for users than introducing a new atom.
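A minimal sketch of the sense-checking idea under discussion, in plain Julia with hypothetical names (this is not Convex.jl's API): an expression records the sense under which its reformulation is valid, and combining it into a problem with the wrong sense raises an error.

```julia
# Hypothetical illustration: a wrapper that remembers the only
# optimization sense under which its epigraph model is valid.
@enum Sense MIN_SENSE MAX_SENSE

struct SensedExpr
    value::Float64   # stand-in for the underlying expression
    sense::Sense     # the sense under which the formulation is valid
end

# Error if the expression is used under the wrong problem sense.
function check_sense(e::SensedExpr, problem_sense::Sense)
    if e.sense != problem_sense
        error("Expression is only valid under $(e.sense); problem uses $(problem_sense)")
    end
    return e.value
end

t = SensedExpr(1.0, MIN_SENSE)
check_sense(t, MIN_SENSE)    # fine
# check_sense(t, MAX_SENSE)  # would throw
```

The real design question in this PR is where that check lives: in a dedicated atom (`OptimizationSenseAtom`) or in the `Problem`-as-subexpression machinery sketched above.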

@odow
Member Author

odow commented May 8, 2024

Okay let me take a look

@odow
Member Author

odow commented May 8, 2024

Closing for now. I'll take another shot at this.

@odow odow closed this May 8, 2024
@odow odow deleted the od/optimization-sense branch May 8, 2024 23:34
Successfully merging this pull request may close these issues.

Partially specified problems