# Changes

This is a record of all past optimagic releases and what went into them in reverse chronological order. We follow semantic versioning and all releases are available on Anaconda.org.

Following the Scientific Python guidelines, we drop official support for Python 3.9.

## 0.5.0

This is a major release with several breaking changes and deprecations. In this
release we started implementing two major enhancement proposals and renamed the package
from estimagic to optimagic (while keeping the `estimagic` namespace for the estimation
capabilities).

The implementation of the two enhancement proposals is not complete and will likely
take until version `0.6.0`. However, all breaking changes and deprecations (with the
exception of a minor change in benchmarking) are already implemented, so updating
to version `0.5.0` is future-proof.

- #500 removes the dashboard, the support for simopt optimizers and the `derivative_plot` (@janosg)
- #504 aligns `maximize` and `minimize` more closely with scipy. All related deprecations and breaking changes are listed below. As a result, scipy code that uses minimize with the arguments `x0`, `fun`, `jac` and `method` will run without changes in optimagic. Similarly, `OptimizeResult` gets some aliases so it behaves more like SciPy's.
- #506 introduces the new `Bounds` object and deprecates `lower_bounds`, `upper_bounds`, `soft_lower_bounds` and `soft_upper_bounds` (@janosg)
- #507 updates the infrastructure so we can make parallel releases under the names `optimagic` and `estimagic` (@timmens)
- #508 introduces the new `ScalingOptions` object and deprecates the `scaling_options` argument of `maximize` and `minimize` (@timmens)
- #512 implements the new interface for objective functions and derivatives (@janosg)
- #513 implements the new `optimagic.MultistartOptions` object and deprecates the `multistart_options` argument of `maximize` and `minimize` (@timmens)
- #514 and #516 introduce the `NumdiffResult` object that is returned from `first_derivative` and `second_derivative`. They also fix several bugs in the pytree handling in `first_derivative` and `second_derivative` and deprecate Richardson extrapolation and the `key` argument (@timmens)
- #517 introduces the new `NumdiffOptions` object for configuring numerical differentiation during optimization or estimation (@timmens)
- #519 rewrites the logging code and introduces new `LogOptions` objects (@schroedk)
- #521 introduces the new internal algorithm interface (@janosg and @mpetrosian)
- #522 introduces the new `Constraint` objects and deprecates passing dictionaries or lists of dictionaries as constraints (@timmens)
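The SciPy alignment from #504 can be illustrated with a minimal sketch. The toy sphere function is made up for illustration, and SciPy's `minimize` is used so the snippet is self-contained; with optimagic 0.5 the same call should run after swapping the import:

```python
import numpy as np
from scipy.optimize import minimize  # with optimagic >= 0.5: from optimagic import minimize

def sphere(x):
    """Toy objective: sum of squares, minimized at the origin."""
    return x @ x

def sphere_grad(x):
    """Closed-form gradient of the sphere function."""
    return 2 * x

# SciPy-style keyword arguments (fun, x0, jac, method) that optimagic's
# minimize now also accepts unchanged.
res = minimize(fun=sphere, x0=np.arange(3.0), jac=sphere_grad, method="L-BFGS-B")
print(np.round(res.x, 6))  # solution close to [0, 0, 0]
```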

### Breaking changes

- When providing a path for the `logging` argument of `maximize` and `minimize` and the file already exists, the default behavior is now to raise an error. Replacement or extension of an existing file must be explicitly configured.
- The argument `if_table_exists` in `log_options` has no effect anymore and a corresponding warning is raised.
- `OptimizeResult.history` is now an `optimagic.History` object instead of a dictionary. Dictionary-style access is implemented but deprecated. Other dictionary methods might not work.
- The result of `first_derivative` and `second_derivative` is now an `optimagic.NumdiffResult` object instead of a dictionary. Dictionary-style access is implemented but other dictionary methods might not work.
- The dashboard is removed.
- The `derivative_plot` is removed.
- Optimizers from simopt are removed.
- Passing callables with the old internal algorithm interface as `algorithm` to `minimize` and `maximize` is no longer supported. Use the new `Algorithm` objects instead. For examples see: https://tinyurl.com/24a5cner

### Deprecations

- The `criterion` argument of `maximize` and `minimize` is renamed to `fun` (as in SciPy).
- The `derivative` argument of `maximize` and `minimize` is renamed to `jac` (as in SciPy).
- The `criterion_and_derivative` argument of `maximize` and `minimize` is renamed to `fun_and_jac` to align it with the other names.
- The `criterion_kwargs` argument of `maximize` and `minimize` is renamed to `fun_kwargs` to align it with the other names.
- The `derivative_kwargs` argument of `maximize` and `minimize` is renamed to `jac_kwargs` to align it with the other names.
- The `criterion_and_derivative_kwargs` argument of `maximize` and `minimize` is renamed to `fun_and_jac_kwargs` to align it with the other names.
- Algorithm-specific convergence and stopping criteria are renamed to align them more with NLopt and SciPy names:
  - `convergence_relative_criterion_tolerance` -> `convergence_ftol_rel`
  - `convergence_absolute_criterion_tolerance` -> `convergence_ftol_abs`
  - `convergence_relative_params_tolerance` -> `convergence_xtol_rel`
  - `convergence_absolute_params_tolerance` -> `convergence_xtol_abs`
  - `convergence_relative_gradient_tolerance` -> `convergence_gtol_rel`
  - `convergence_absolute_gradient_tolerance` -> `convergence_gtol_abs`
  - `convergence_scaled_gradient_tolerance` -> `convergence_gtol_scaled`
  - `stopping_max_criterion_evaluations` -> `stopping_maxfun`
  - `stopping_max_iterations` -> `stopping_maxiter`
- The arguments `lower_bounds`, `upper_bounds`, `soft_lower_bounds` and `soft_upper_bounds` are deprecated and replaced by `optimagic.Bounds`. This affects `maximize`, `minimize`, `estimate_ml`, `estimate_msm`, `slice_plot` and several other functions.
- The `log_options` argument of `minimize` and `maximize` is deprecated. Instead, `LogOptions` objects can be passed under the `logging` argument.
- The class `OptimizeLogReader` is deprecated and redirects to `SQLiteLogReader`.
- The `scaling_options` argument of `maximize` and `minimize` is deprecated. Instead, a `ScalingOptions` object can be passed under the `scaling` argument, which was previously just a bool.
- Objective functions that return a dictionary with the special keys `"value"`, `"contributions"` and `"root_contributions"` are deprecated. Instead, likelihood and least-squares functions are marked with a `mark.likelihood` or `mark.least_squares` decorator. There is a detailed how-to guide that shows the new behavior. This affects `maximize`, `minimize`, `slice_plot` and other functions that work with objective functions.
- The `multistart_options` argument of `minimize` and `maximize` is deprecated. Instead, a `MultistartOptions` object can be passed under the `multistart` argument.
- Richardson extrapolation is deprecated in `first_derivative` and `second_derivative`.
- The `key` argument is deprecated in `first_derivative` and `second_derivative`.
- Passing dictionaries or lists of dictionaries as `constraints` to `maximize` or `minimize` is deprecated. Use the new `Constraint` objects instead.

## 0.4.7

This release contains minor improvements and bug fixes. It is the last release before the package will be renamed to optimagic and two large enhancement proposals will be implemented.

- #490 adds the attribute `optimize_result` to the `MomentsResult` class (@timmens)
- #483 fixes a bug in the handling of keyword arguments in `bootstrap` (@alanlujan91)
- #477 allows using an identity weighting matrix in MSM estimation (@sidd3888)
- #473 fixes a bug where bootstrap keyword arguments were ignored in `get_moments_cov` (@timmens)
- #467, #478, #479 and #480 improve the documentation (@mpetrosian, @segsell and @timmens)

## 0.4.6

This release drastically improves the optimizer benchmarking capabilities, especially with noisy functions and parallel optimizers. It makes tranquilo and numba optional dependencies and is the first version of estimagic to be compatible with Python 3.11.

#464 Makes tranquilo and numba optional dependencies (@janosg)

#461 Updates docstrings for `process_benchmark_results` (@segsell)

#460 Fixes several bugs in the processing of benchmark results with noisy functions (@janosg)

#459 Prepares benchmarking functionality for parallel optimizers (@mpetrosian and @janosg)

#455 Improves a local pre-commit hook (@ChristianZimpelmann)

## 0.4.5

#379 Improves the estimation table (@ChristianZimpelmann)

#445 fixes line endings in local pre-commit hook (@ChristianZimpelmann)

#443, #444, #445, #446, #448 and #449 are a major refactoring of tranquilo (@timmens and @janosg)

#441 Adds an aggregated convergence plot for benchmarks (@mpetrosian)

## 0.4.4

#437 removes fuzzywuzzy as dependency (@aidatak97)

#427 improves pre-commit setup (@timmens and @hmgaudecker)

#425 improves handling of notebooks in documentation (@baharcos)

#423 and #399 add code to calculate poisedness constants (@segsell)

#420 improves CI infrastructure (@hmgaudecker, @janosg)

## 0.4.3

## 0.4.2

#412 Improves the output of the fides optimizer among other small changes (@janosg)

#411 Fixes a bug in multistart optimizations with least squares optimizers. See #410 for details (@janosg)

#404 speeds up the gqtpar subsolver (@mpetrosian)

#400 refactors subsolvers (@mpetrosian)

#398, #397, #395, #390, #389, #388 continue with the implementation of tranquilo (@segsell, @timmens, @mpetrosian, @janosg)

#391 speeds up the bntr subsolver (@mpetrosian)

## 0.4.1

#307 Adopts a code of conduct and governance model

#384 Polish documentation (@janosg and @mpetrosian)

#294 Adds the very first experimental version of tranquilo (@janosg, @timmens, @segsell, @mpetrosian)

## 0.4.0

## 0.3.4

## 0.3.3

## 0.3.2

## 0.3.1

#349 fixes multiple small bugs and adds test cases for all of them (@mpetrosian, @janosg and @timmens)

## 0.3.0

First release with pytree support in optimization, estimation and differentiation, and much better result objects in optimization and estimation.

### Breaking changes

- A new `OptimizeResult` object is returned by `maximize` and `minimize`. This breaks all code that expects the old result dictionary. Usage of the new result is explained in the getting started tutorial on optimization.
- There is a new internal optimizer interface that can break optimization with custom optimizers.
- The interface of `process_constraints` changed quite drastically. This breaks code that used `process_constraints` to get the number of free parameters or to check if constraints are valid. There are new high-level functions `estimagic.check_constraints` and `estimagic.count_free_params` instead.
- Some functions from `estimagic.logging.read_log` are removed and replaced by `estimagic.OptimizeLogReader`.
- Convenience functions to create namedtuples are removed from `estimagic.utilities`.

#345 Moves `estimation_table` to the new latex functionality of pandas (@mpetrosian)

#343 Improves the result object of estimation functions and makes msm estimation pytree compatible (@janosg)

#342 Improves default options of the fides optimizer, allows single constraints and polishes the documentation (@janosg)

#340 Enables history collection for optimizers that evaluate the criterion function in parallel (@janosg)

#339 Incorporates user feedback and polishes the documentation.

#335 Introduces an `OptimizeResult` object and functionality for history plotting (@janosg).

#333 Uses new history collection feature to speed up benchmarking (@segsell).

#328 Improves quadratic surrogate solvers used in pounders and tranquilo (@segsell).

#326 Improves documentation of numerical derivatives (@timmens).

#325 Improves the slice_plot (@mpetrosian)

#324 Adds ability to collect optimization histories without logging (@janosg).

#311 and #288 rewrite all plotting code in plotly (@timmens and @aidatak97).

#306 improves quadratic surrogate solvers used in pounders and tranquilo (@segsell).

#305 allows pytrees during optimization and rewrites large parts of the constraints processing (@janosg).

#303 introduces a new optimizer interface that makes it easier to add optimizers and makes it possible to access optimizer specific information outside of the `internal_criterion_and_derivative` (@janosg and @roecla).

## 0.2.5

## 0.2.4

## 0.2.3

#295 Fixes a small bug in estimation_table (@mpetrosian).

#286 Adds pytree support for first and second derivative (@timmens).

#285 Allows to use estimation functions with external optimization (@janosg).

#283 Adds fast solvers for quadratic trustregion subproblems (@segsell).

#282 Vastly improves estimation tables (@mpetrosian).

#281 Adds some tools to work with pytrees (@janosg and @timmens).

#278 adds Estimagic Enhancement Proposal 1 for the use of Pytrees in Estimagic (@janosg)

## 0.2.2

## 0.2.1

## 0.2.0

Add a lot of new functionality with a few minor breaking changes. We have more
optimizers, better error handling, bootstrap and inference for method of simulated
moments. The breaking changes are:
- logging is disabled by default during optimization.
- the log_option `if_exists` was renamed to `if_table_exists`.
- the comparison plot function is removed.
- `first_derivative` now returns a dictionary, independent of arguments.
- the structure of the logging database has changed.
- there is an additional boolean flag named `scaling` in `minimize` and `maximize`.

#251 Allows the loading, running and visualization of benchmarks (@janosg, @mpetrosian and @roecla)

#196 Adds support for multistart optimizations (@asouther4 and @janosg)

#146 Adds `estimate_ml` functionality (@janosg, @LuisCald and @s6soverd).

#215 Adds optimizers from the pygmo library (@roecla and @janosg)

#212 Adds optimizers from the nlopt library (@mpetrosian)

#228 Restructures testing and makes changes to `log_options`.

#219 Several enhancements (@tobiasraabe)

#218 Improve documentation (@sofyaakimova and @effieHan)

#214 Fix bug with overlapping `fixed` and `linear` constraints (@janosg)

#211 Improve error handling of log reading functions (@janosg)

#148 Add bootstrap functionality (@RobinMusolff)

#206 Improve latex and html tables (@mpetrosian)

#205 Add scipy's least squares optimizers (based on #197 by @yradeva93)

#198 More unit tests for optimizers (@mchandra12)

#200 Plot intermediate outputs of `first_derivative` (@timmens)

## 0.1.3 - 2021-06-25

## 0.1.2 - 2021-02-07

## 0.1.1 - 2021-01-13

This release greatly expands the set of available optimization algorithms, has a better and prettier dashboard and improves the documentation.

#183 Improve documentation (@SofiaBadini)

#182 Allow for constraints in likelihood inference (@janosg)

#181 Add DF-OLS optimizer from Numerical Algorithm Group (@roecla)

#180 Add pybobyqa optimizer from Numerical Algorithm Group (@roecla)

#179 Allow base_steps and min_steps to be scalars (@tobiasraabe)

#173 Add new color palettes and use them in dashboard (@janosg)

## 0.1.0dev1 - 2020-09-08

This release entails a complete rewrite of the optimization code with many breaking changes. In particular, some optimizers that were available before are no longer available. Those will be re-introduced soon. The breaking changes include:

The database is restructured. The new version simplifies the code, makes logging faster and avoids the sql column limit.

Users can provide closed-form derivatives and/or a `criterion_and_derivative` function, where the latter can exploit synergies in the calculation of criterion and derivative. This is also compatible with constraints.

Our own (parallelized) `first_derivative` function is used to calculate gradients during the optimization when no closed-form gradients are provided.

Optimizer options like convergence criteria and optimization results are harmonized across optimizers.

Users can choose from several batch evaluators whenever we parallelize (e.g. for parallel optimizations or parallel function evaluations for numerical derivatives) or pass in their own batch evaluator function as long as it has a compatible interface. The batch evaluator interface also standardizes error handling.

There is a well-defined internal optimizer interface. Users can select the pre-implemented optimizers via `algorithm="name_of_optimizer"` or their own optimizer via `algorithm=custom_minimize_function`.

Optimizers from pygmo and nlopt are no longer supported (will be re-introduced)

Greatly improved error handling.

#169 Add additional dashboard arguments

#168 Rename lower and upper to lower_bound and upper_bound (@ChristianZimpelmann)

#166 Re-add POUNDERS from TAO (@tobiasraabe)

#165 Re-add the scipy optimizers with harmonized options (@roecla)

#164 Closed form derivatives for parameter transformations (@timmens)

#163 Complete rewrite of optimization with breaking changes (@janosg)

#162 Improve packaging and relax version constraints (@tobiasraabe)

#160 Generate parameter tables in tex and html (@mpetrosian)

## 0.0.31 - 2020-06-20

#130 Improve wrapping of POUNDERS algorithm (@mo2561057)

#159 Add Richardson Extrapolation to first_derivative (@timmens)

## 0.0.30 - 2020-04-22

## 0.0.29 - 2020-04-16

#153 adds documentation for the CLI (@tobiasraabe)

#152 makes estimagic work with pandas 1.0 (@SofiaBadini)

## 0.0.28 - 2020-03-17

#150 adds command line interface to the dashboard (@tobiasraabe)