Estimagic


0.1.3

- {gh}`195` Illustrate optimizers in documentation ({ghuser}`sofyaakimova`,
{ghuser}`effieHan` and {ghuser}`janosg`)
- {gh}`201` More stable covariance matrix calculation ({ghuser}`janosg`)
- {gh}`199` Return intermediate outputs of first_derivative ({ghuser}`timmens`)
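A minimal sketch of calling first_derivative and inspecting its output; the top-level
import and the simple sphere function are illustrative assumptions, and the exact
structure of the returned intermediate outputs depends on the estimagic version:

```python
import numpy as np
from estimagic import first_derivative  # older versions may expose this under a submodule

def sphere(params):
    return params @ params

# Depending on the version, the result is the derivative itself or a dict that
# additionally contains intermediate outputs such as the underlying function evaluations.
result = first_derivative(sphere, params=np.arange(3.0))
print(result)
```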

0.1.2

- {gh}`189` Improve documentation and logging ({ghuser}`roecla`)

0.1.1

This release greatly expands the set of available optimization algorithms, ships a
better and prettier dashboard, and improves the documentation.

- {gh}`187` Implement dot notation in algo_options ({ghuser}`roecla`)
- {gh}`183` Improve documentation ({ghuser}`SofiaBadini`)
- {gh}`182` Allow for constraints in likelihood inference ({ghuser}`janosg`)
- {gh}`181` Add the DFO-LS optimizer from the Numerical Algorithms Group ({ghuser}`roecla`)
- {gh}`180` Add the Py-BOBYQA optimizer from the Numerical Algorithms Group ({ghuser}`roecla`)
- {gh}`179` Allow base_steps and min_steps to be scalars ({ghuser}`tobiasraabe`)
- {gh}`178` Refactoring of dashboard code ({ghuser}`roecla`)
- {gh}`177` Add stride as a new dashboard argument ({ghuser}`roecla`)
- {gh}`176` Minor fix of plot width in dashboard ({ghuser}`janosg`)
- {gh}`174` Various dashboard improvements ({ghuser}`roecla`)
- {gh}`173` Add new color palettes and use them in dashboard ({ghuser}`janosg`)
- {gh}`172` Add high level log reading functions ({ghuser}`janosg`)

0.1.0dev1

This release entails a complete rewrite of the optimization code with many breaking
changes. In particular, some optimizers that were available before are no longer
available; they will be re-introduced soon. The breaking changes include:

- The database is restructured. The new version simplifies the code, makes logging
faster, and avoids the SQL column limit.
- Users can provide a closed-form derivative and/or criterion_and_derivative, where
the latter can exploit synergies in the calculation of criterion and derivative.
This is also compatible with constraints.
- Our own (parallelized) first_derivative function is used to calculate gradients
during the optimization when no closed-form gradients are provided.
- Optimizer options like convergence criteria and optimization results are harmonized
across optimizers.
- Users can choose from several batch evaluators whenever we parallelize
(e.g. for parallel optimizations or parallel function evaluations for numerical
derivatives) or pass in their own batch evaluator function as long as it has a
compatible interface. The batch evaluator interface also standardizes error handling.
- There is a well-defined internal optimizer interface. Users can select a
pre-implemented optimizer via algorithm="name_of_optimizer" or plug in their own
optimizer via algorithm=custom_minimize_function (see the sketch after this list).
- Optimizers from pygmo and nlopt are no longer supported (they will be re-introduced).
- Greatly improved error handling.
- {gh}`169` Add additional dashboard arguments
- {gh}`168` Rename lower and upper to lower_bound and upper_bound
({ghuser}`ChristianZimpelmann`)
- {gh}`167` Improve dashboard styling ({ghuser}`roecla`)
- {gh}`166` Re-add POUNDERS from TAO ({ghuser}`tobiasraabe`)
- {gh}`165` Re-add the scipy optimizers with harmonized options ({ghuser}`roecla`)
- {gh}`164` Closed form derivatives for parameter transformations ({ghuser}`timmens`)
- {gh}`163` Complete rewrite of optimization with breaking changes ({ghuser}`janosg`)
- {gh}`162` Improve packaging and relax version constraints ({ghuser}`tobiasraabe`)
- {gh}`160` Generate parameter tables in tex and html ({ghuser}`mpetrosian`)
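As a rough illustration of the new interface, here is a minimal sketch of selecting a
pre-implemented optimizer and supplying a closed-form derivative. The sphere criterion,
the scipy_lbfgsb algorithm name, and the exact keyword names (derivative,
criterion_and_derivative) are assumptions for illustration and may differ slightly
across versions:

```python
import pandas as pd
from estimagic import minimize

# params as a DataFrame with a "value" column, as expected by early estimagic versions
params = pd.DataFrame({"value": [1.0, 2.0, 3.0]})

def sphere(params):
    return (params["value"] ** 2).sum()

def sphere_derivative(params):
    return 2 * params["value"].to_numpy()

# Select a pre-implemented optimizer by name and pass a closed-form derivative.
res = minimize(
    criterion=sphere,
    params=params,
    algorithm="scipy_lbfgsb",
    derivative=sphere_derivative,  # keyword name assumed; see the release notes above
)

# Alternatively, any callable with a compatible internal optimizer interface can be
# passed directly, e.g. minimize(..., algorithm=my_custom_minimize_function).
```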

0.0.31

- {gh}`130` Improve wrapping of POUNDERS algorithm ({ghuser}`mo2561057`)
- {gh}`159` Add Richardson Extrapolation to first_derivative ({ghuser}`timmens`)

0.0.30

- {gh}`158` Allow specifying a gradient in maximize and minimize ({ghuser}`janosg`)
