
Expose presolve code #68

Closed
joehuchette opened this issue Dec 4, 2020 · 9 comments

@joehuchette

@mtanneau mentioned at the last JuMP developers call that he was considering making the presolve code here in Tulip available for use in other packages. I'm starting work on a prototype solver for which this code would be useful. I'll eventually want to build out some MIP presolve routines on top.

What is your preferred way to do this? Would you prefer we depend on Tulip? Or would you consider splitting the code off into a separate package?

cc @BochuanBob and @Anhtu07

@mtanneau
Member

mtanneau commented Dec 4, 2020

I think it makes sense to eventually break things off into smaller packages, especially if several projects use it.

At this point, Tulip's presolve is almost self-contained in src/Presolve. It is still tied to some Tulip-level data structures, notably ProblemData and, to a lesser extent, Model and Solution (which handle the interface with the internal IPM optimizers).

My current belief is that a "stand-alone" presolve package should be able to operate as follows:

  • Receive the original problem in some format
  • Perform presolve (this can be black-box)
  • Return presolved model (in a format similar to the original) and necessary ingredients for pre/post crush

Pointer for some inspiration: COIN-OR's OSIPresolve.
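A minimal Julia sketch of those three steps, under the assumption of a matrix-form input; every name here (`TinyLP`, `PresolveResult`, `presolve`, `postsolve`) is a placeholder, not existing Tulip API, and the only reduction implemented is dropping variables fixed by their bounds:

```julia
struct TinyLP{T}
    c::Vector{T}    # objective coefficients
    lc::Vector{T}   # variable lower bounds
    uc::Vector{T}   # variable upper bounds
end

struct PresolveResult{T}
    reduced::TinyLP{T}     # presolved problem, same format as the input
    ops::Vector{Function}  # recorded transformations, undone at postsolve
end

# Step 2: perform presolve. Here the only reduction is removing variables
# that are fixed by their bounds (lc[i] == uc[i]).
function presolve(lp::TinyLP)
    keep  = findall(i -> lp.lc[i] < lp.uc[i], eachindex(lp.c))
    fixed = setdiff(eachindex(lp.c), keep)
    xfix  = lp.lc[fixed]
    reduced = TinyLP(lp.c[keep], lp.lc[keep], lp.uc[keep])
    undo = x -> begin                   # ingredient for post crush
        xfull = similar(x, length(lp.c))
        xfull[keep]  .= x
        xfull[fixed] .= xfix
        xfull
    end
    return PresolveResult(reduced, Function[undo])
end

# Step 3: map a reduced-space solution back to the original space.
postsolve(res::PresolveResult, x) = foldr((op, v) -> op(v), res.ops; init = x)
```

Usage, on a 3-variable problem where x₂ is fixed at 5:

```julia
lp  = TinyLP([1.0, 2.0, 3.0], [0.0, 5.0, 0.0], [10.0, 5.0, 10.0])
res = presolve(lp)          # res.reduced has 2 variables
postsolve(res, [0.0, 0.0])  # → [0.0, 5.0, 0.0]
```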

Some questions to fuel the discussion:

  • What classes of problems? (MI)LP? (MI)Conic? (MI)NLP?
  • What would it be interfaced to? MOI? Something lower-level?

cc @frapac

@joehuchette
Author

joehuchette commented Dec 4, 2020

We only care about MILP.

For our purposes, we don't necessarily want to be tied to a particular solver, and will eventually pipe the model through MOI anyway (to solve and/or bridge). So, working at the MOI level is probably best for us.

Is there anything about your current approach that will not map nicely to MOI?

We will also potentially want to disable certain presolve routines but not others (e.g. in order to keep problem dimensions the same). So baking that into the API would be very useful.
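That kind of selective control could be exposed as an options struct that the presolve driver consults before running each routine — a hypothetical sketch (none of these names exist in Tulip), where the dimension-changing reductions can be switched off independently of the dimension-preserving ones:

```julia
Base.@kwdef struct PresolveOptions
    remove_empty_rows::Bool      = true   # changes problem dimensions
    remove_fixed_variables::Bool = true   # changes problem dimensions
    tighten_bounds::Bool         = true   # keeps dimensions unchanged
    scale::Bool                  = true   # keeps dimensions unchanged
end

# The driver would check each flag before running the corresponding routine;
# here we just report which routines would run.
function enabled_routines(opts::PresolveOptions)
    names = String[]
    opts.remove_empty_rows      && push!(names, "remove_empty_rows")
    opts.remove_fixed_variables && push!(names, "remove_fixed_variables")
    opts.tighten_bounds         && push!(names, "tighten_bounds")
    opts.scale                  && push!(names, "scale")
    return names
end

# Keep problem dimensions intact by disabling the row/column-removing reductions:
opts = PresolveOptions(remove_empty_rows = false, remove_fixed_variables = false)
enabled_routines(opts)   # → ["tighten_bounds", "scale"]
```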

@mtanneau
Member

mtanneau commented Dec 4, 2020

Internally, the presolve code works with an LP representation

min    c'x + c₀
s.t.   lr ≤ Ax ≤ ur
       lc ≤  x ≤ uc

where A is stored row-by-row and column-by-column (to allow fast column-wise and row-wise access).
Bounds, right-hand sides, and integrality requirements (if you were doing MILP) are stored in vectors.
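A rough Julia sketch of that representation (field names are illustrative, not the actual `src/Presolve` types): storing both A and its transpose in CSC format gives cheap access to a column's or a row's nonzeros.

```julia
using SparseArrays

struct LPData{T}
    c::Vector{T}                  # objective:  min c'x + c₀
    c0::T
    A::SparseMatrixCSC{T,Int}     # column-wise storage of A
    At::SparseMatrixCSC{T,Int}    # A' stored CSC — its columns are A's rows
    lr::Vector{T}                 # row bounds:    lr ≤ Ax ≤ ur
    ur::Vector{T}
    lc::Vector{T}                 # column bounds: lc ≤ x ≤ uc
    uc::Vector{T}
    is_integer::BitVector         # integrality flags, for the MILP case
end

A  = [1.0 2.0; 0.0 3.0]
lp = LPData([1.0, 1.0], 0.0, sparse(A), sparse(collect(A')),
            [-Inf, -Inf], [4.0, 6.0], [0.0, 0.0], [Inf, Inf], falses(2))

# Fast row-wise access: row i of A is column i of lp.At.
lp.At[:, 1] == sparsevec(A[1, :])   # true
```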

To interface with MOI, you need to do the conversion MOI -> LP and then back LP -> MOI.
It's not hard, and should become even easier once the functionality of MatrixOptInterface gets merged into MOI.

@frapac

frapac commented Dec 4, 2020

I would also be interested in having a presolve package for https://github.com/exanauts/Simplex.jl
I do not know when MatrixOptInterface will be merged into MOI. Maybe it would make sense to build a MOIPresolve.jl on top of MatOI, but I do not know if that's the best solution available.

@joehuchette
Author

It seems reasonable to me to keep the internal representation the same, and then add an MOI and/or MatrixOptInterface layer on top as needed. Based on my brief look, I don't see any reason why you could not add the MIP information directly to your internal data structures. Generally, I'm not all that concerned about the indirection of going through MOI, so I'm fine with tying the API to MOI if others are as well.

Ideally, there would also be a programmatic way to configure the presolve! function.

@joehuchette
Author

(By the way, we have some cycles to spend on making this happen, once we converge on a plan).

@mtanneau
Member

mtanneau commented Dec 4, 2020

I suggest we take the discussion over to... 🚧 MathOptPresolve.jl 🚧

@dpo

dpo commented Dec 5, 2020

I'm also very interested in building upon Tulip's presolve but I don't use MOI. It would be great to simply pass an LP or QP in "matrix form" and receive a presolved problem in the same format. It seems there could be a low-level API with an MOI layer on top.

@mtanneau
Member

This issue has been stale for 2 years; closing.

For reference:
