Abstract
Optim provides a range of optimization capabilities written in the Julia programming language (Bezanson et al. 2017). Our aim is to enable researchers, users, and other Julia packages to solve optimization problems without writing such algorithms themselves. The package supports optimization on manifolds, functions of complex numbers, and input types such as arbitrary-precision vectors and matrices. We have implemented routines for derivative-free, first-order, and second-order optimization methods. Users can provide derivatives themselves, or request that they be calculated using automatic differentiation or finite difference methods. The main focus of the package has so far been on unconstrained optimization; however, box-constrained optimization is supported, and more comprehensive support for constraints is underway.
Similar to Optim, the C library NLopt (Johnson 2008) contains a collection of nonlinear optimization routines. In Python, scipy.optimize supports many of the same algorithms as Optim does, and Pymanopt (Townsend, Niklas, and Weichwald 2016) is a toolbox for manifold optimization. Within the Julia community, the packages BlackBoxOptim.jl and Optimize.jl provide optimization capabilities focusing on derivative-free and large-scale smooth problems, respectively. The packages Convex.jl and JuMP.jl (Dunning, Huchette, and Lubin 2017) define modelling languages in which users can formulate optimization problems. In contrast to the previously mentioned optimization codes, Convex and JuMP work as abstraction layers between the user and solvers from other packages.
Optimization routines
As of version 0.14, the following optimization routines are available.
• Second-order methods
– Newton
– Newton with trust region
– Hessian-vector with trust region
• First-order methods
– BFGS
– L-BFGS (with linear preconditioning)
– Conjugate gradient (with linear preconditioning)
– Gradient descent (with linear preconditioning)
• Acceleration methods
– Nonlinear GMRES
– Objective acceleration
• Derivative-free methods
– Nelder–Mead
– Simulated annealing
– Particle swarm
• Interval-bound univariate methods
– Brent's method
– Golden-section search
The derivative-based methods use line searches to assist convergence. Multiple line search algorithms are available, including interpolating backtracking and methods that aim to satisfy the Wolfe conditions. Minimal usage sketches for these interfaces are given below.
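To illustrate, the following is a minimal sketch of the package's optimize interface, based on its documented API in recent releases; the Rosenbrock test function and the particular method and line search choices are illustrative, not prescriptive.

```julia
using Optim, LineSearches

# The Rosenbrock function, a standard test problem with minimum at [1, 1].
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
x0 = zeros(2)

# Derivative-free: Nelder-Mead needs no gradient information.
optimize(rosenbrock, x0, NelderMead())

# First-order: gradients computed by forward-mode automatic differentiation,
# using an interpolating backtracking line search from LineSearches.jl.
optimize(rosenbrock, x0, BFGS(linesearch = BackTracking()); autodiff = :forward)

# Second-order: Newton's method, with the Hessian also obtained via
# automatic differentiation.
res = optimize(rosenbrock, x0, Newton(); autodiff = :forward)
Optim.minimizer(res)  # ≈ [1.0, 1.0]
```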
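The univariate interval methods take bracketing endpoints rather than an initial point. Again, a hedged sketch of the documented usage, with an illustrative objective:

```julia
using Optim

# Minimise a univariate function over the bracketing interval [0, 4].
f(x) = (x - 2.0)^2 + 1.0

res_brent  = optimize(f, 0.0, 4.0, Brent())          # Brent's method (the default)
res_golden = optimize(f, 0.0, 4.0, GoldenSection())  # golden-section search

Optim.minimizer(res_brent)  # ≈ 2.0
```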
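Box constraints, mentioned above, are handled by wrapping an unconstrained inner optimizer. The sketch below assumes the Fminbox interface of recent releases (the constructor syntax differed slightly around version 0.14); the bounds and starting point are arbitrary.

```julia
using Optim

f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

lower = [-1.0, -1.0]
upper = [0.5, 0.5]
x0    = [0.0, 0.0]   # must lie strictly inside the box

# Fminbox enforces the bounds with a barrier term around the inner optimizer;
# gradients default to finite differences in this call.
res = optimize(f, lower, upper, x0, Fminbox(GradientDescent()))
Optim.minimizer(res)  # minimiser of f restricted to the box
```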
The optimization routines in this package have been used in both industrial and academic contexts. For example, parts of the internal work at the company Ternary Intelligence Inc. (Paramonov 2017) rely on the package. Notably, an upcoming book on optimization (Kochenderfer and Wheeler, forthcoming 2018) uses Optim for its examples. Optim has been used for a wide range of applications in academic research, including optimal control (Riseth, Dewynne, and Farmer 2017; Riseth 2017a), parameter estimation (Riseth and Taylor-King 2017; Rackauckas and Nie 2017; Dony, He, and Stumpf 2018), quantum physics (Damle, Levitt, and Lin 2018), crystalline modelling (Chen and Ortner 2017; Braun, Buze, and Ortner 2017), and the large-scale astronomical cataloguing project Celeste (Regier et al. 2015; Regier et al. 2016). A new acceleration scheme for optimization (Riseth 2017b) and a preconditioning scheme for geometry optimisation (Packwood et al. 2016) have also been tested within the Optim framework.