FEAT: Add the Nelder-Mead algorithm #441
Conversation
I agree with having a common interface which can be used for either scalar or multivariate functions. Maybe we could do something similar to SciPy and have both a common interface and individual functions, i.e. ...
But if we decide not to go for a common interface for now, I think we should call this function something more like ...
- Jitting is commented out to generate the coverage report
I've tried adding this to the lecture but don't seem to get convergence. Here is a look at the output from this function (stored in ...). Here is the piece of code, just in case you spot something wrong with it: ... Also, why is the bounds parameter set to have a shape of two by default? Is this needed? Thanks
The bounds parameter is set this way to get Numba to work without using ...
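To illustrate the point about Numba and default arguments: a jitted function cannot easily dispatch on `bounds=None`, so one common workaround is to always pass a concrete `(n, 2)` array whose rows default to `[-inf, inf]`. This is a hedged sketch of that idea (the helper name `make_default_bounds` is hypothetical, not the PR's actual code):

```python
import numpy as np

def make_default_bounds(n):
    # One row of (lower, upper) per variable; [-inf, inf] means
    # "effectively unconstrained", so the jitted routine always sees
    # a single concrete array type instead of an Optional.
    bounds = np.empty((n, 2))
    bounds[:, 0] = -np.inf
    bounds[:, 1] = np.inf
    return bounds

b = make_default_bounds(3)
# b.shape == (3, 2); all entries are +/- inf
```

Under this convention a "shape of two" default simply means two unconstrained variables; for more variables the caller (or the library) would build an `(n, 2)` array instead.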
Is that an issue for maximising with more than 2 arguments? Should we have something that automatically detects the size of params and creates the bounds?
Here's the notebook: https://gist.github.com/natashawatkins/807eb4e19fd86ae1ebc8f6dcadd530fb
It looks like the problem is coming from the algorithm. Both my version and SciPy often seem to cycle. I can sometimes get convergence by tweaking the parameters depending on the ...
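For reference, the parameter tweaking mentioned above can be reproduced with SciPy's Nelder-Mead options (`xatol`, `fatol`, `maxiter`); tightening the tolerances and raising the iteration cap often rescues runs that appear to stall. The objective below is just an illustration (a Rosenbrock-type function), not the lecture's problem:

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock-type test function with minimum at (1, 1)
f = lambda x: (x[0] - 1.0) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

# Tighter simplex tolerances and a generous iteration budget
res = minimize(f, np.array([-1.0, 1.0]), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000})
```

Whether this helps depends on the objective; on genuinely degenerate problems Nelder-Mead can still cycle regardless of the options.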
So in the live lecture at the moment, it's using the COBYLA method. Do you think we should implement this one in the QuantEcon library?
Actually, after checking the live lecture, the problem seems to be that the constraints are missing. By passing ...
Is there a way we can add a constraints argument to the function? Also, are you finding it quite slow? It seems to be a lot slower than the 'brute force' method.
Hey @natashawatkins, ...
@chrishyland Thanks for commenting on this. From your notebook, I see how you've implemented box constraints such as ...
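One simple way to impose box constraints on a maximizer that only accepts an unconstrained objective is to penalize points outside the box with `-inf` (this is a generic sketch of the trick, not necessarily what the notebook does; the wrapper name `boxed` is hypothetical):

```python
import numpy as np

def boxed(f, lower, upper):
    # Wrap an objective so that any point outside [lower, upper]
    # (elementwise) is scored -inf; a maximizer will then stay inside.
    def g(x):
        if np.any(x < lower) or np.any(x > upper):
            return -np.inf
        return f(x)
    return g

g = boxed(lambda x: -(x ** 2).sum(),
          np.array([0.5, 0.5]), np.array([2.0, 2.0]))
```

The downside is that the penalty makes the objective discontinuous at the boundary, which is part of why a bounds-aware implementation inside the algorithm is preferable.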
Hi @QBatista, ...
@jstac @natashawatkins Having a common interface for the Nelder-Mead algorithm and ...

One option would be to follow SciPy, which uses two different interfaces: one dedicated to function minimization with only one variable (called ...). Another option would be to make the initial guess argument optional and raise an error message if the method used requires an initial guess. SciPy uses a similar idea for its L-BFGS-B method in the ...

In terms of maintenance, the first option would have a more straightforward implementation, especially since we want the interface to be jitted, and therefore should be easier to maintain.

@oyamad Please let me know if you have any suggestions for designing this interface.
Thanks @QBatista. In option one, it's not clear from your discussion how the initial-guess issue is resolved. The distinction in SciPy is between multivariate and univariate routines, is it not? Can you be a bit more specific about the proposed interface?
@jstac In SciPy, the methods that do not require an initial guess are methods which can be applied exclusively to functions of one variable, and therefore they all fall in ...
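Concretely, the SciPy split being described is between `scipy.optimize.minimize_scalar` (univariate, no initial guess, only bounds or a bracket) and `scipy.optimize.minimize` (multivariate, requires `x0`):

```python
import numpy as np
from scipy.optimize import minimize, minimize_scalar

# Univariate: no initial guess, just a search interval
res1 = minimize_scalar(lambda x: (x - 2.0) ** 2,
                       bounds=(0.0, 5.0), method="bounded")

# Multivariate: an initial guess x0 is mandatory
res2 = minimize(lambda x: ((x - 2.0) ** 2).sum(),
                x0=np.zeros(2), method="Nelder-Mead")
```

An analogous split for QuantEcon would put the one-variable routines in one entry point and the `x0`-requiring routines in another.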
Thanks for the explanation. I guess something analogous would be OK for us.
Should we just stick to having separate functions for different algorithms? What was the issue regarding `generated_jit` again?
That's another option -- as long as we don't have too many methods, we could simply have a set of ... @natashawatkins
I don't like the idea of having ... I think we could just call it ... Doesn't this get rid of the ...
Yes, that's fine too. Just one thing to note: we currently also have root-finding methods in ...
Sorry, I meant ...
I'm in favor of @natashawatkins's suggestion.
Sounds good -- I'll make the changes soon.
This is now ready for review.
Cleverly written, well documented, excellent tests. Great job @QBatista.
Thanks @QBatista, this is looking good. I will attach a ...
Once this is merged, I will issue a new release via PyPI.
@mmcky are you releasing a new version of the library? I might use this in a lecture.
Adds a jitted implementation of the Nelder-Mead algorithm for multivariate optimization.
The two main references are Lagarias et al. (1998) for the algorithm and Singer and Singer (2004) for efficiency.
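For readers unfamiliar with the method, here is a minimal, unjitted sketch of the Nelder-Mead simplex loop with the standard coefficients from Lagarias et al. (1998). It is a simplification of what the PR implements (it uses a single contraction step rather than distinguishing inside and outside contractions, and it minimizes rather than maximizes), so treat it as illustration only:

```python
import numpy as np

def nelder_mead_min(f, x0, tol=1e-8, max_iter=1000):
    # Initial simplex: x0 plus a small step along each coordinate axis
    n = x0.size
    simplex = np.vstack([x0] + [x0 + 0.1 * np.eye(n)[i] for i in range(n)])
    fvals = np.array([f(x) for x in simplex])
    for _ in range(max_iter):
        order = np.argsort(fvals)          # sort vertices best -> worst
        simplex, fvals = simplex[order], fvals[order]
        if np.max(np.abs(simplex[1:] - simplex[0])) < tol:
            break                           # simplex has collapsed
        centroid = simplex[:-1].mean(axis=0)
        xr = centroid + (centroid - simplex[-1])       # reflection
        fr = f(xr)
        if fr < fvals[0]:
            xe = centroid + 2.0 * (centroid - simplex[-1])  # expansion
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr             # accept reflection
        else:
            xc = centroid + 0.5 * (simplex[-1] - centroid)  # contraction
            fc = f(xc)
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:                                       # shrink toward best
                simplex[1:] = simplex[0] + 0.5 * (simplex[1:] - simplex[0])
                fvals[1:] = [f(x) for x in simplex[1:]]
    return simplex[np.argmin(fvals)]
```

The PR's version additionally jit-compiles this loop with Numba, supports bounds, and follows Singer and Singer (2004) to avoid recomputing the centroid from scratch each iteration.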
It might be a good idea to have a common interface for scalar_maximization.py and this multivariate version. Please let me know if you have any suggestions for setting up this interface and choosing names.
Performance
Closes #419