ENH: minimize_ipopt: add input validation for bounds and support for new-style constraints #207
Conversation
Some self-review to aid the reviewer.
@@ -19,7 +19,7 @@
 else:
     SCIPY_INSTALLED = True
     del scipy
-from scipy.optimize import approx_fprime
+from scipy import optimize
I needed some other things from optimize.
@@ -486,13 +486,11 @@ def minimize_ipopt(fun,
     (fun, x0, args, kwargs, method, jac, hess, hessp,
      bounds, constraints, tol, callback, options) = res

-    _x0 = np.atleast_1d(x0)
Conversion to 1d is now done in _minimize_ipopt_iv.
Previously, the variable was renamed to _x0 so that at the end, if x0 had been a scalar, the x attribute of the result object would be a scalar.
In SciPy, I am very careful about shapes, and I have corrected many shape bugs related to this sort of thing in SciPy code. However, I'm not sure if this is needed here.
The documentation (which now mirrors that of scipy.optimize.minimize) states that x0 should be an array-like of shape (n,), where n is the number of independent variables. This function does not handle arrays of arbitrary dimensionality, and there is a separate minimize_scalar function for functions of a single variable. Also, the x attribute of the object returned by scipy.optimize.minimize is a 1d array even when x0 is a scalar:
from scipy.optimize import minimize

def f(x):
    return x**2

minimize(f, 1).x  # [-7.450e-09]
All these things suggest to me that the output should always be a 1d array (even if the function is flexible about accepting scalar input rather than raising an error, and about working with functions that return scalars).
LMK if you'd prefer for this to be left alone.
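For concreteness, here is a minimal sketch of the kind of normalization being discussed (illustrative only, assuming the conversion lives in an input-validation helper such as _minimize_ipopt_iv; this is not the PR's actual code):

import numpy as np

def normalize_x0(x0):
    # Accept a scalar or 1-d array-like x0, but always return a 1-d float
    # array so that the x attribute of the result is consistently 1-d,
    # matching scipy.optimize.minimize.
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    if x0.ndim != 1:
        raise ValueError("x0 must have at most one dimension.")
    return x0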
If our function here supported scalar and array objective functions simultaneously, I think we'd need to deprecate the scalar functionality for at least one release.
This would not change what types of objective functions are supported. It makes the function more consistent with the documentation, which states:

In SciPy, the x attribute of the result object is always 1D. However, currently in minimize_ipopt, the x attribute of the result object may be a scalar. The proposed change would resolve this inconsistency and make minimize_ipopt behave as advertised - like scipy.optimize.minimize.

It may surprise users who relied on the result attribute x being a scalar in some cases, but this behavior wasn't consistent with the documentation, so it could be considered a bug fix.
Ok. I consider this backwards incompatible, regardless of what the docs say it should do, but we can make the change with no deprecation.
I could still use feedback on the rest, and I can address it all together.
Comments about changes to the tests copied from SciPy.
@@ -129,7 +129,7 @@ def __init__(self,

         if hess is not None:
             self.obj_hess = hess
-        if jac is None:
+        if not jac:
When jac is False, the code currently treats it as though it were a callable.
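A rough sketch of the intended behavior (hypothetical code, not the PR's implementation): with a truthiness check, None and False both fall through to a finite-difference gradient, and only a callable jac is used directly.

import numpy as np
from scipy.optimize import approx_fprime

def make_gradient(fun, jac, eps=1e-8):
    # jac may be None, False, or a callable. Checking "not jac" treats None
    # and False the same way, so False is no longer called as if it were a
    # function.
    if not jac:
        return lambda x: approx_fprime(np.atleast_1d(x), fun, eps)
    return jac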
@@ -522,7 +522,9 @@ def minimize_ipopt(fun,
     # Rename some default scipy options
     replace_option(options, b'disp', b'print_level')
     replace_option(options, b'maxiter', b'max_iter')
-    if b'print_level' not in options:
+    if getattr(options, 'print_level', False) is True:
When disp is True, we get an error because print_level can't be a bool.
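As a loose illustration (the helper below is hypothetical and simplified; the real code uses byte-string option keys and replace_option), the point is that a boolean disp has to become an integer print_level before it reaches Ipopt, which expects an integer verbosity level:

def normalize_print_level(options):
    # Hypothetical sketch: rename SciPy's disp to Ipopt's print_level and
    # convert a bool to an integer level (Ipopt's default level is 5).
    options = dict(options or {})
    if 'disp' in options:
        options['print_level'] = options.pop('disp')
    if isinstance(options.get('print_level'), bool):
        options['print_level'] = 5 if options['print_level'] else 0
    return options

normalize_print_level({'disp': True})  # {'print_level': 5}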
@@ -31,6 +30,8 @@ class TestSLSQP:
     This example maximizes the function f(x) = 2*x*y + 2*x - x**2 - 2*y**2,
     which has a maximum at x=2, y=1.
     """
+    atol = 1e-7
SLSQP passes these tests without a tolerance, but Ipopt currently needs one.
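For illustration (the exact call sites are in the copied tests, so the snippet below is an assumption rather than a quote), the comparisons simply gain the class-level tolerance:

from numpy.testing import assert_allclose

atol = 1e-7  # the tolerance added at class level in the diff above
assert_allclose([1.9999999, 1.0000001], [2, 1], atol=atol)  # passes with the tolerance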
-        # Minimize, method='SLSQP': unbounded, approximated jacobian.
-        jacs = [None, False, '2-point', '3-point']
+        # Minimize, method=None: unbounded, approximated jacobian.
+        jacs = [None, False]
I suppose if we want the interface to exactly match minimize, we need to accept these strings. I don't think it's very important, personally.
                       constraints={'type': 'ineq',
                                    'fun': self.f_ieqcon2,
                                    'jac': self.fprime_ieqcon2},
                       options=self.opts)
        assert_(res['success'], res['message'])
        assert_allclose(res.x, [2, 1])

    def test_minimize_bounded_constraint(self):
Ipopt does not evaluate strictly within the bounds.
        assert_(-0.8 <= res.x[0] <= 1)
        assert_(-1 <= res.x[1] <= 0.8)
These lines are not really deleted; tolerances are added.
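For example (illustrative only; the exact tolerance used in the new tests may differ), the relaxed checks look roughly like:

atol = 1e-7  # assumed tolerance, for illustration
x = [-0.80000001, 0.80000001]  # hypothetical solver output just outside the bounds
assert -0.8 - atol <= x[0] <= 1 + atol
assert -1 <= x[1] <= 0.8 + atol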
        assert_(-0.8 <= res.x[0] <= 1)
        assert_(-1 <= res.x[1] <= 0.8)

    # fmin_slsqp
Delete all these fmin_slsqp tests; we only want tests of the minimize interface.
        sol = minimize(f, [10], method='slsqp', constraints=cons_u)
        assert_(sol.success)
        assert_allclose(sol.x, 0, atol=1e-10)

        sol = minimize(f, [-10], method='slsqp', constraints=cons_l)
        assert_(sol.success)
        assert_allclose(sol.x, 2, atol=1e-10)
The changes below look substantial, but really I only changed method to None and removed the last two tests because Ipopt thinks they are infeasible.
                       options={'disp': False, 'maxiter': 10000})

        # The problem is infeasible, so it cannot succeed
        assert not res.success

    def test_parameters_stay_within_bounds(self):
IPOPT doesn't warn, so this test is not needed.
Branch updated from a1fe6b8 to ea962a6.
        for grad in (prob.grad, False):
            if prob == list_of_problems[1]:  # MaratosGradInFunc
                grad = True
            for hess in (None,):
We can't use any of the Hessian options now because a) if hess is provided, all the constraint Hessians must be provided, and b) constraint Hessians can't be transferred from NonlinearConstraint to the old constraint format (because the old constraint format in SciPy doesn't ever include constraint Hessians).
It seems possible to get around this, but that would be another PR.
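To make the limitation concrete, here is an illustrative comparison of the two constraint formats (not code from the PR): a NonlinearConstraint can carry its own Hessian callable, while the old dict-based format has no field for one.

import numpy as np
from scipy.optimize import NonlinearConstraint

# New-style constraint: hess(x, v) returns the Hessian of v[0] * c(x).
new_style = NonlinearConstraint(
    fun=lambda x: x[0]**2 + x[1]**2,
    lb=-np.inf, ub=1.0,
    jac=lambda x: np.array([2 * x[0], 2 * x[1]]),
    hess=lambda x, v: 2 * v[0] * np.eye(2),
)

# Old-style constraint: only 'type', 'fun', 'jac' (and 'args'); no Hessian slot.
old_style = {
    'type': 'ineq',
    'fun': lambda x: 1.0 - x[0]**2 - x[1]**2,
    'jac': lambda x: np.array([-2 * x[0], -2 * x[1]]),
}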
        assert_array_almost_equal(result1.x, prob.x_opt, decimal=5)
        assert_array_almost_equal(result2.x, prob.x_opt, decimal=5)

    def test_hessp(self):
No support for hessp.
        assert_array_almost_equal(result1.x, prob.x_opt, decimal=5)
        assert_array_almost_equal(result2.x, prob.x_opt, decimal=5)
Don't need to test L-BFGS-B here.
        # Also check existence of the 'niter' attribute, for backward
        # compatibility
        assert_(result.get('niter', -1) == 1)
I think this was because trust-constr used to have the wrong name for this attribute. In any case, it is not needed here.
         assert_(result.get('success'))
-        assert_(result.get('nit', -1) == 1)
+        assert result.get('nit', -1) == 0
I guess IPOPT solves this in 0 iterations?
@@ -689,22 +608,20 @@ def assert_inbounds(x):
             assert np.all(x <= bnds.ub)

         def obj(x):
-            assert_inbounds(x)
Ipopt doesn't seem to respect bounds at every iteration.
@@ -713,10 +630,14 @@ def nci(x):

         ref = minimize(fun=obj, x0=x0, method='slsqp',
                        bounds=bnds, constraints=nlcs)
-        assert_allclose(res.fun, ref.fun)
+        assert ref.success
+        assert res.fun <= ref.fun
Ipopt finds a much better solution than slsqp, apparently.
@moorepants This looks like a crazy PR, but I don't think it's actually as bad as it looks. The vast majority of the lines are tests copy-pasted from SciPy.

The next commit simply adds a test file from SciPy verbatim, so it probably doesn't need to be reviewed from scratch. The commit after that, 270af6a, makes the (bare minimum, for the most part) adjustments needed to make the tests pass. That would be worth reviewing. The tests caught some minor bugs, so there are a few corrections as well. I think that the only other commit that needs to be reviewed is the last one, 17e8a32.
Also, would you be willing to set a minimum supported SciPy version? It would be nice to simplify some of the import code, and perhaps we could test against some of the old versions.
The PR title was changed from "ENH: minimize_ipopt: add input validation for bounds and support for Bounds objects" to "ENH: minimize_ipopt: add input validation for bounds and support for new-style constraints".
@moorepants just thought I'd send a ping. I know this looks pretty extensive, but I wrote in #207 (comment) why it shouldn't be that bad to review. After this, I think I'm done with what I had planned. Thanks for your help!
I have not looked at this and don't recall the status. If it is ready for review I could have a look during August. It looks like a number of CI runs have failed though.
I think it's ready for review because I'd like to know how you'd like to handle the failure produced by:

from scipy.optimize import rosen, Bounds
from cyipopt import minimize_ipopt
bounds = Bounds([4.0, 5.0], [4.0, 5.0])
minimize_ipopt(rosen, [-10, 8], bounds=bounds)
# message: b'Algorithm terminated successfully at a locally optimal point, satisfying the convergence tolerances (can be specified by options).'
# success: True
# status: 0
# fun: 12109.0
# x: [ 4.000e+00 5.000e+00]
# nit: 0
# info: status: 0
# x: [ 4.000e+00 5.000e+00]
# g: []
# obj_val: 12109.0
# mult_g: []
# mult_x_L: [ 1.761e+04 0.000e+00]
# mult_x_U: [ 0.000e+00 2.200e+03]
# status_msg: b'Algorithm terminated successfully at a locally optimal point, satisfying the convergence tolerances (can be specified by options).'
# nfev: 1
# njev: 3

Can you reproduce the error? I've pushed a commit that will help me figure out what's going on with the other test.

Update: Looks like Ipopt just finds a different but equally good minimizer on some platforms.
If you are copying lines from SciPy, we need to include SciPy's license in this repo. I'm not sure if we discussed this before, but ideally we don't have to have two licenses in this code base. Are the copy-pasted lines from SciPy necessary? Can you not import from SciPy instead?
Yes. My general approach has always been to support the SciPy version that shipped with the most recent Ubuntu LTS. It looks like SciPy 1.8.0 shipped with 22.04 LTS.
     x0 = 1
     def f(x):
-        return x**2
+        return x @ x
Has this functionality changed?
In [3]: import numpy as np
In [4]: x = np.array([1])
In [5]: x @ x
Out[5]: 1
In [6]: type(x @ x)
Out[6]: numpy.int64
In [7]: type(x**2)
Out[7]: numpy.ndarray
In [8]: x**2
Out[8]: array([1])
I think I changed this line just to generalize it to be a generic multivariate quadratic bowl rather than only working for scalars. It still works for scalars, though.
from cyipopt import minimize_ipopt

def f(x):
    return x ** 2

minimize_ipopt(f, x0=1)

returning
fun: 0.0
info: {'x': array([0.]), 'g': array([], dtype=float64), 'obj_val': 0.0, 'mult_g': array([], dtype=float64), 'mult_x_L': array([0.]), 'mult_x_U': array([0.]), 'status': 0, 'status_msg': b'Algorithm terminated successfully at a locally optimal point, satisfying the convergence tolerances (can be specified by options).'}
message: b'Algorithm terminated successfully at a locally optimal point, satisfying the convergence tolerances (can be specified by options).'
nfev: 7
nit: 1
njev: 3
status: 0
success: True
x: array([0.])
One option for the SciPy license is that we do not ship these two test files with the released tarball. We could include the SciPy license in the test directory for those two SciPy files (if you download from this repository), but we specifically do not include the files in the release and thus avoid having to add the SciPy license to the release. If possible, I think that is my preference.
Yeah they needed slight modification. Pretty much all of those changes are in the commit TST: minimize_ipopt: make new tests pass.
Good call. The SciPy license is very permissive (New BSD), though. I can note which tests are copied from SciPy and include this license in the source file. Would you like me to do that?
Great. I think we can make that a separate PR, though.
cyipopt/scipy_interface.py (Outdated)
con_jac = lambda x0, *args, **kwargs: optimize.approx_fprime(
    x0, con_fun, eps, *args, **kwargs)
This looks like it snuck in with a merge or something.
@moorepants I think I've replied to your comments. It sounds like I need to:
Yes, this sounds good. Then we can exclude that single file in the
Done. Does GitHub save artifacts from the doc build? I'd like to see how those links rendered.
No, you have to build the docs yourself locally.
I'm good with this and can merge. I'll wait to see if you want to check the docs render. Thanks for the addition! I'll work on a new release of cyipopt next.
Sounds good. Merging.
Follow-up to gh-206. Adds input validation for the bounds parameter of minimize_ipopt and support for scipy.optimize.Bounds objects.
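For readers skimming the thread, here is a minimal sketch of the kind of bounds validation being described (assuming a helper in the spirit of _minimize_ipopt_iv; the function below is illustrative and not the code merged in this PR):

import numpy as np
from scipy.optimize import Bounds

def validate_bounds(bounds, n):
    # Illustrative helper: accept a scipy.optimize.Bounds object or a sequence
    # of (lb, ub) pairs, expand to one pair per variable, and reject
    # inconsistent input before the arrays are handed to Ipopt.
    if bounds is None:
        return None
    if isinstance(bounds, Bounds):
        lb = np.broadcast_to(np.asarray(bounds.lb, dtype=float), (n,))
        ub = np.broadcast_to(np.asarray(bounds.ub, dtype=float), (n,))
    else:
        pairs = np.asarray(bounds, dtype=float)
        if pairs.shape != (n, 2):
            raise ValueError("bounds must contain one (lb, ub) pair per variable.")
        lb, ub = pairs[:, 0], pairs[:, 1]
    if np.any(lb > ub):
        raise ValueError("a lower bound exceeds the corresponding upper bound.")
    return lb, ub

validate_bounds(Bounds([4.0, 5.0], [4.0, 5.0]), n=2)  # fixed variables are allowed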