Releases: optimagic-dev/optimagic
v0.5.1
Summary
This is a minor release that introduces the new algorithm selection tool and several
small improvements.
To learn more about the algorithm selection feature, see the how-to guide added in #536 below.
Pull Requests
- #549 Add support for Python 3.13 (@timmens)
- #550 and #534 implement the new algorithm selection tool (@janosg); see the sketch after this list
- #548 and #531 improve the documentation (@ChristianZimpelmann)
- #544 Adjusts the results processing of the nag optimizers to be compatible with the latest releases (@timmens)
- #543 Adds support for numpy 2.x (@timmens)
- #536 Adds a how-to guide for choosing local optimizers (@mpetrosian)
- #535 Allows algorithm classes and instances in estimation functions (@timmens)
- #532 Makes several small improvements to the documentation (@janosg)
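The following is a minimal sketch of how an algorithm instance can be passed to `minimize` with this release. The location of `scipy_lbfgsb` under `om.algos` and the `stopping_maxfun` option are assumptions based on these notes; the how-to guide referenced above documents the actual algorithm selection API.

```python
import numpy as np
import optimagic as om


def sphere(params):
    return params @ params


# Pass an algorithm instance instead of a string. scipy_lbfgsb under
# om.algos and the stopping_maxfun option are illustrative assumptions.
res = om.minimize(
    fun=sphere,
    params=np.arange(5.0),
    algorithm=om.algos.scipy_lbfgsb(stopping_maxfun=1_000),
)
print(res.params)
```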
v0.5.0
Summary
This is a major release with several breaking changes and deprecations. In this
release we started implementing two major enhancement proposals and renamed the package
from estimagic to optimagic (while keeping the `estimagic` namespace for the estimation
capabilities).
The implementation of the two enhancement proposals is not complete and will likely
take until version 0.6.0. However, all breaking changes and deprecations (with the
exception of a minor change in benchmarking) are already implemented, such that updating
to version 0.5.0 is future-proof.
Pull Requests
- #500 removes the dashboard, the support for simopt optimizers and the `derivative_plot` (@janosg)
- #502 renames estimagic to optimagic (@janosg)
- #504 aligns `maximize` and `minimize` more closely with SciPy. All related deprecations and breaking changes are listed below. As a result, SciPy code that uses minimize with the arguments `x0`, `fun`, `jac` and `method` will run without changes in optimagic. Similarly, the `OptimizeResult` gets some aliases so it behaves more like SciPy's; a sketch follows this list.
- #506 introduces the new `Bounds` object and deprecates `lower_bounds`, `upper_bounds`, `soft_lower_bounds` and `soft_upper_bounds` (@janosg)
- #507 updates the infrastructure so we can make parallel releases under the names optimagic and estimagic (@timmens)
- #508 introduces the new `ScalingOptions` object and deprecates the `scaling_options` argument of `maximize` and `minimize` (@timmens)
- #512 implements the new interface for objective functions and derivatives (@janosg)
- #513 implements the new `optimagic.MultistartOptions` object and deprecates the `multistart_options` argument of `maximize` and `minimize` (@timmens)
- #514 and #516 introduce the `NumdiffResult` object that is returned from `first_derivative` and `second_derivative`. They also fix several bugs in the pytree handling in `first_derivative` and `second_derivative` and deprecate Richardson Extrapolation and the `key` argument (@timmens)
- #517 introduces the new `NumdiffOptions` object for configuring numerical differentiation during optimization or estimation (@timmens)
- #519 rewrites the logging code and introduces new `LogOptions` objects (@schroedk)
- #521 introduces the new internal algorithm interface (@janosg and @mpetrosian)
- #522 introduces the new `Constraint` objects and deprecates passing dictionaries or lists of dictionaries as constraints (@timmens)
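As an illustration of the SciPy alignment described in #504, the sketch below runs the same SciPy-style call through both libraries. It is a minimal example based on the claims above; the `method="L-BFGS-B"` mapping and the `x` alias on optimagic's result are assumptions derived from these notes rather than a verified API reference.

```python
import numpy as np
from scipy.optimize import minimize as scipy_minimize

import optimagic as om


def fun(x):
    return x @ x


def jac(x):
    return 2 * x


# SciPy baseline.
scipy_res = scipy_minimize(fun=fun, x0=np.arange(5.0), jac=jac, method="L-BFGS-B")

# Per the notes above, the same x0 / fun / jac / method arguments are
# accepted by optimagic's minimize without changes.
om_res = om.minimize(fun=fun, x0=np.arange(5.0), jac=jac, method="L-BFGS-B")

# OptimizeResult gets SciPy-like aliases; om_res.x is assumed here to mirror
# om_res.params.
print(scipy_res.x, om_res.x)
```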
Breaking changes
- When providing a path for the argument `logging` of the functions `maximize` and `minimize` and the file already exists, the default behavior is to raise an error now. Replacement or extension of an existing file must be explicitly configured.
- The argument `if_table_exists` in `log_options` has no effect anymore and a corresponding warning is raised.
- `OptimizeResult.history` is now an `optimagic.History` object instead of a dictionary. Dictionary-style access is implemented but deprecated. Other dictionary methods might not work.
- The result of `first_derivative` and `second_derivative` is now an `optimagic.NumdiffResult` object instead of a dictionary. Dictionary-style access is implemented but other dictionary methods might not work (see the sketch after this list).
- The dashboard is removed.
- The `derivative_plot` is removed.
- Optimizers from Simopt are removed.
- Passing callables with the old internal algorithm interface as `algorithm` to `minimize` and `maximize` is not supported anymore. Use the new `Algorithm` objects instead. For examples see: https://tinyurl.com/24a5cner
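To make the change to the numerical differentiation results concrete, here is a minimal sketch. The `derivative` attribute name on the `NumdiffResult` object is an assumption based on the former dictionary key; dictionary-style access should still work as described above.

```python
import numpy as np
import optimagic as om


def sphere(params):
    return params @ params


res = om.first_derivative(sphere, params=np.arange(3.0))

# The result is an optimagic.NumdiffResult object instead of a dictionary.
# The gradient is assumed to be available as the `derivative` attribute;
# res["derivative"] remains available as a dictionary-style alias.
print(res.derivative)
```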
Deprecations
- The `criterion` argument of `maximize` and `minimize` is renamed to `fun` (as in SciPy).
- The `derivative` argument of `maximize` and `minimize` is renamed to `jac` (as in SciPy).
- The `criterion_and_derivative` argument of `maximize` and `minimize` is renamed to `fun_and_jac` to align it with the other names.
- The `criterion_kwargs` argument of `maximize` and `minimize` is renamed to `fun_kwargs` to align it with the other names.
- The `derivative_kwargs` argument of `maximize` and `minimize` is renamed to `jac_kwargs` to align it with the other names.
- The `criterion_and_derivative_kwargs` argument of `maximize` and `minimize` is renamed to `fun_and_jac_kwargs` to align it with the other names.
- Algorithm-specific convergence and stopping criteria are renamed to align them more with NLopt and SciPy names:
  - `convergence_relative_criterion_tolerance` -> `convergence_ftol_rel`
  - `convergence_absolute_criterion_tolerance` -> `convergence_ftol_abs`
  - `convergence_relative_params_tolerance` -> `convergence_xtol_rel`
  - `convergence_absolute_params_tolerance` -> `convergence_xtol_abs`
  - `convergence_relative_gradient_tolerance` -> `convergence_gtol_rel`
  - `convergence_absolute_gradient_tolerance` -> `convergence_gtol_abs`
  - `convergence_scaled_gradient_tolerance` -> `convergence_gtol_scaled`
  - `stopping_max_criterion_evaluations` -> `stopping_maxfun`
  - `stopping_max_iterations` -> `stopping_maxiter`
- The arguments `lower_bounds`, `upper_bounds`, `soft_lower_bounds` and `soft_upper_bounds` are deprecated and replaced by `optimagic.Bounds`. This affects `maximize`, `minimize`, `estimate_ml`, `estimate_msm`, `slice_plot` and several other functions (see the sketch after this list).
- The `log_options` argument of `minimize` and `maximize` is deprecated. Instead, `LogOptions` objects can be passed under the `logging` argument.
- The class `OptimizeLogReader` is deprecated and redirects to `SQLiteLogReader`.
- The `scaling_options` argument of `maximize` and `minimize` is deprecated. Instead, a `ScalingOptions` object can be passed under the `scaling` argument that was previously just a bool.
- Objective functions that return a dictionary with the special keys "value", "contributions" and "root_contributions" are deprecated. Instead, likelihood and least-squares functions are marked with a `mark.likelihood` or `mark.least_squares` decorator. There is a detailed how-to guide that shows the new behavior. This affects `maximize`, `minimize`, `slice_plot` and other functions that work with objective functions.
- The `multistart_options` argument of `minimize` and `maximize` is deprecated. Instead, a `MultistartOptions` object can be passed under the `multistart` argument.
- Richardson Extrapolation is deprecated in `first_derivative` and `second_derivative`.
- The `key` argument is deprecated in `first_derivative` and `second_derivative`.
- Passing dictionaries or lists of dictionaries as `constraints` to `maximize` or `minimize` is deprecated. Use the new `Constraint` objects instead.
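The sketch below shows how a call that previously used `lower_bounds`, `scaling_options` and `multistart_options` might look with the new `Bounds`, `ScalingOptions` and `MultistartOptions` objects, plus a least-squares objective marked with the new decorator. It is a minimal illustration of the deprecations listed above; the exact field names on the option objects (e.g. `n_samples`) and the default-constructed `ScalingOptions()` are assumptions.

```python
import numpy as np
import optimagic as om


@om.mark.least_squares
def residuals(params):
    # A least-squares objective now returns residuals directly instead of a
    # dict with the deprecated "root_contributions" key.
    return params - np.array([1.0, 2.0, 3.0])


res = om.minimize(
    fun=residuals,
    params=np.array([5.0, 5.0, 5.0]),
    algorithm="scipy_lbfgsb",
    # The Bounds object replaces lower_bounds / upper_bounds.
    bounds=om.Bounds(lower=np.full(3, -10.0), upper=np.full(3, 10.0)),
    # MultistartOptions replaces the multistart_options argument; n_samples
    # is an illustrative field name.
    multistart=om.MultistartOptions(n_samples=20),
    # ScalingOptions replaces the scaling_options argument; defaults assumed.
    scaling=om.ScalingOptions(),
)
print(res.params)
```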
v0.5.0rc2
v0.5.0rc1
First release candidate for version 0.5.0. The release notes are identical to those of the final v0.5.0 release above.
v0.4.7
This release contains minor improvements and bug fixes. It is the last release before
the package is renamed to optimagic and two large enhancement proposals are
implemented.
- #490 adds the attribute `optimize_result` to the `MomentsResult` class (@timmens)
- #483 fixes a bug in the handling of keyword arguments in `bootstrap` (@alanlujan91)
- #477 allows using an identity weighting matrix in MSM estimation (@sidd3888)
- #473 fixes a bug where bootstrap keyword arguments were ignored in `get_moments_cov` (@timmens)
- #467, #478, #479 and #480 improve the documentation (@mpetrosian, @segsell, and @timmens)
v0.4.6
This release drastically improves the optimizer benchmarking capabilities, especially
with noisy functions and parallel optimizers. It makes tranquilo and numba optional
dependencies and is the first version of estimagic to be compatible with Python
3.11.
- #464 Makes tranquilo and numba optional dependencies (@janosg)
- #461 Updates docstrings for `process_benchmark_results` (@segsell)
- #460 Fixes several bugs in the processing of benchmark results with noisy functions (@janosg)
- #459 Prepares benchmarking functionality for parallel optimizers (@mpetrosian and @janosg)
- #457 Removes some unused files (@segsell)
- #455 Improves a local pre-commit hook (@ChristianZimpelmann)
v0.4.5
- #379 Improves the estimation table (@ChristianZimpelmann)
- #445 fixes line endings in local pre-commit hook (@ChristianZimpelmann)
- #443, #444, #445, #446, #448 and #449 are a major refactoring of tranquilo (@timmens and @janosg)
- #441 Adds an aggregated convergence plot for benchmarks (@mpetrosian)
- #435 Completes the cartis-roberts benchmark set (@segsell)
v0.4.4
- #437 removes fuzzywuzzy as a dependency (@aidatak97)
- #432 makes logging compatible with sqlalchemy 2.x (@janosg)
- #430 refactors the getter functions in Tranquilo (@janosg)
- #427 improves pre-commit setup (@timmens and @hmgaudecker)
- #425 improves handling of notebooks in documentation (@baharcos)
- #423 and #399 add code to calculate poisedness constants (@segsell)
- #420 improves CI infrastructure (@hmgaudecker, @janosg)
- #407 adds global optimizers from scipy (@baharcos)
v0.4.3
v0.4.2
This release contains a bugfix and several improvements.
If you have used multistart optimizations with a least squares optimizer, you should update as quickly as possible.
- #412 Improves the output of the fides optimizer among other small changes (@janosg)
- #411 Fixes a bug in multistart optimizations with least squares optimizers. See #410 for details (@janosg)
- #404 speeds up the gqtpar subsolver (@mpetrosian)
- #400 refactors subsolvers (@mpetrosian)
- #398, #397, #395, #390, #389, #388 continue with the implementation of tranquilo (@segsell, @timmens, @mpetrosian, @janosg)
- #391 speeds up the bntr subsolver