Abstract
This paper describes two optimal subgradient algorithms for solving structured large-scale constrained convex optimization problems. More specifically, the first algorithm is optimal both for smooth problems with Lipschitz continuous gradients and for Lipschitz continuous nonsmooth problems, while the second algorithm is optimal for Lipschitz continuous nonsmooth problems. In addition, we consider two classes of problems: (i) a convex objective over a simple closed convex domain, where the orthogonal projection onto this feasible domain is efficiently available; and (ii) a convex objective with a simple convex functional constraint. If we equip our algorithms with an appropriate prox-function, the associated subproblem can be solved either in closed form or by a simple iterative scheme, which is especially important for large-scale problems. We report numerical results for several applications to show the efficiency of the proposed schemes.
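To make problem class (i) concrete, the following sketch shows a classical projected subgradient step over a simple closed convex set with a cheap orthogonal projection. This is a generic illustration of the setting, not the paper's own algorithm; the feasible set (the Euclidean unit ball), the example objective, and the diminishing step size are all assumptions chosen for simplicity.

```python
import numpy as np

def project_unit_ball(x):
    """Orthogonal projection onto the simple set {x : ||x||_2 <= 1}.

    For this set the projection has a closed form, which is what makes
    projection-based methods practical for large-scale problems.
    """
    nrm = np.linalg.norm(x)
    return x if nrm <= 1.0 else x / nrm

def projected_subgradient(subgrad, x0, steps=500):
    """Minimize a convex f over the unit ball via a subgradient oracle.

    Illustrative only: a plain projected subgradient scheme with a
    diminishing step size 1/sqrt(k), not the optimal method of the paper.
    """
    x = project_unit_ball(np.asarray(x0, dtype=float))
    for k in range(1, steps + 1):
        g = subgrad(x)                                # any subgradient of f at x
        x = project_unit_ball(x - g / np.sqrt(k))     # step, then project back
    return x

# Example (assumed for illustration): f(x) = ||x - c||_1 with c outside
# the ball; sign(x - c) is a valid subgradient of f.
c = np.array([2.0, 0.0])
x_star = projected_subgradient(lambda x: np.sign(x - c), x0=np.zeros(2))
# The constrained minimizer is (1, 0), the boundary point closest to c.
```

Because both the objective's subgradient and the projection are inexpensive, each iteration costs O(n), which is the property that makes such schemes attractive in the large-scale regime the abstract describes.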
Ahookhosh, M., & Neumaier, A. (2017). Optimal subgradient algorithms for large-scale convex optimization in simple domains. Numerical Algorithms, 76(4), 1071–1097. https://doi.org/10.1007/s11075-017-0297-x