A comparison of optimization solvers for log binomial regression including conic programming


Abstract

Relative risks are estimated to assess associations and effects because of their ease of interpretability, e.g., in epidemiological studies. Fitting log-binomial regression models allows the estimated regression coefficients to be used to infer relative risks directly. Estimating these models, however, is complicated by the constraints that have to be imposed on the parameter space. In this paper we systematically compare different optimization algorithms for obtaining the maximum likelihood estimates of the regression coefficients in log-binomial regression. We first establish under which conditions the maximum likelihood estimates are guaranteed to be finite and unique, which allows problematic cases to be identified and excluded. In simulation studies with artificial data we compare the performance of different optimizers, including solvers based on the augmented Lagrangian method, interior-point methods including a conic optimizer, majorize-minimize algorithms, iteratively reweighted least squares, and expectation-maximization algorithm variants. We demonstrate that conic optimizers emerge as the preferred choice due to their reliability, their speed, and the fact that no hyperparameters need to be tuned.
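In a log-binomial model the success probability is modeled as P(Y = 1 | x) = exp(x'beta), so exp(beta_j) is directly the relative risk associated with covariate j; the estimation is constrained because x'beta <= 0 must hold for every observation to keep the fitted probabilities at most 1. As a minimal illustration of this constrained maximum likelihood problem (a generic sketch on simulated data, not one of the solvers benchmarked in the paper; all variable names and values are hypothetical), one can pass the linear constraints to a general-purpose optimizer such as SciPy's trust-constr method:

import numpy as np
from scipy.optimize import minimize, LinearConstraint

rng = np.random.default_rng(0)

# Simulated data: one binary covariate; true coefficients chosen so that
# exp(x'beta) stays below 1 (illustrative values only).
n = 2000
x = rng.binomial(1, 0.5, size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([-1.5, 0.4])      # log baseline risk, log relative risk
y = rng.binomial(1, np.exp(X @ beta_true))

def neg_log_lik(beta):
    eta = X @ beta                      # linear predictor, must stay <= 0
    eta = np.minimum(eta, -1e-10)       # guard against log(0)
    return -np.sum(y * eta + (1 - y) * np.log1p(-np.exp(eta)))

# Constrain X beta <= 0 so every fitted probability exp(x'beta) is at most 1.
constraint = LinearConstraint(X, -np.inf, -1e-8)
fit = minimize(neg_log_lik, x0=np.array([-2.0, 0.0]),
               method="trust-constr", constraints=[constraint])

print("estimated coefficients:", fit.x)
print("estimated relative risk exp(beta_1):", np.exp(fit.x[1]))

The paper compares dedicated approaches (augmented Lagrangian, interior-point and conic solvers, MM algorithms, IRLS, and EM variants) for exactly this constrained likelihood; the sketch above only shows the structure of the optimization problem.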

Citation (APA)

Schwendinger, F., Grün, B., & Hornik, K. (2021). A comparison of optimization solvers for log binomial regression including conic programming. Computational Statistics, 36(3), 1721–1754. https://doi.org/10.1007/s00180-021-01084-5
