optimagic wants to enable domain experts to solve difficult nonlinear optimization problems without having to become algorithm experts. Since there is no universally efficient optimizer, this requires access to many different algorithms as well as diagnostic tools to compare how well an algorithm works on a given problem.
To achieve this, we wrap optimizers from SciPy, NLopt, Pygmo, and many other packages and give them a unified and familiar `minimize` interface. Since our `minimize` function is a superset of SciPy's, the switch to optimagic is effortless for many users.
Besides a seamless switch between different optimizers, optimagic provides the following features:
- Plotting tools to visualize the history of one or several optimizers on a given problem.
- Parameters don’t have to be flat arrays. Use (nested) dictionaries, NamedTuples or whatever is best for your problem.
- Simple parallelization on the algorithm level or during the calculation of numerical derivatives.
- Error handling and persistent logging to deal with or diagnose numerical instabilities in objective functions.
- Compatibility with automatic differentiation via JAX.
For more details and examples, have a look at this tutorial.
optimagic was formerly called estimagic, because it also provides functionality to perform statistical inference on estimated parameters. estimagic is now a subpackage of optimagic.
Install with

```shell
pip install optimagic
```

or via conda-forge with

```shell
conda install -c conda-forge optimagic
```
For details and instructions, check out the documentation.
Roadmap
We are currently working on the following topics:
Enhancement Proposal 2: Static typing
This enhancement proposal explains the adoption of static typing in optimagic. It has three major goals:
- Users will benefit from IDE tools such as easier discoverability of options and autocompletion.
- Developers and users will find code easier to read due to type hints.
- The codebase will become more robust due to static type checking and use of stricter types in internal functions.
The detailed enhancement proposal can be found here. It contains several improvements that go beyond the adoption of static typing such as a better integration with JAX and a better interface for least-squares problems.
All breaking changes of this enhancement proposal have been implemented in version 0.5.0. The remaining changes will be implemented by version 0.6.0.
Enhancement Proposal 3: Alignment with SciPy
This enhancement proposal describes the changes needed to become a superset of `scipy.optimize.minimize`. After it is completed, all code that was written for SciPy's `minimize` function will also run with optimagic. This lowers switching costs and lets users gradually adopt the advanced features of optimagic.
The detailed enhancement proposal can be found here.
All breaking changes of this enhancement proposal have been implemented in version 0.5.0. The remaining changes will be implemented by version 0.6.0.
Wrapping more optimizers
We are open to requests to add any optimizer with Python bindings. We are currently planning to wrap the following libraries or optimizers:
- Powell’s gradient free optimizers via PRIMA
- Optimizers from Nevergrad
- Optimizers from Bayesian Optimization
- Migrad via iminuit
- SNOPT, SLSQP, NLPQLP, NSGA2, PSQP, ParOpt, CONMIN, and ALPSO, either directly or by wrapping PyOptSparse.
- Knitro
- Optimizers from ensmallen via pyensmallen
- Optimizers from the book Practical Mathematical Optimization (source code distributed via optimagic with permission from the authors)
- Optimizers from the argmin package in Rust
Acknowledgments
optimagic originated in the Open Source Economics group at the University of Bonn in 2019. Since 2022, it has been NumFOCUS-affiliated. It has received funding from the University of Bonn, TRA Modeling, and the Hoover Institution.