An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function

This article is free to access.

Abstract

In this work we aim to solve a convex-concave saddle point problem where the coupling function is smooth in one variable and nonsmooth in the other, and is not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, which is alternated with a proximal step in the nonsmooth component of the coupling function. We consider the convex-concave, convex-strongly concave and strongly convex-strongly concave settings of the saddle point problem under investigation. Regarding iterates we obtain (weak) convergence, a convergence rate of order O(1/K) and linear convergence of order O(θ^K) with θ < 1, respectively. In terms of function values we obtain ergodic convergence rates of order O(1/K), O(1/K^2) and O(θ^K) with θ < 1, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi-kernel support vector machines, and a classification problem incorporating minimax group fairness.
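The alternation described in the abstract — an optimistic (extrapolated) gradient ascent step plus a prox of the regulariser in the smooth variable, followed by a proximal step of the coupling function in the nonsmooth variable — can be sketched as below. This is an illustrative reconstruction on a toy scalar problem, not the authors' implementation; the helper names (`grad_y_phi`, `prox_g`, `prox_x_phi`) and the chosen step sizes are hypothetical.

```python
import numpy as np

def ogaprox(grad_y_phi, prox_g, prox_x_phi, x0, y0, tau, sigma, theta, iters):
    """Illustrative OGAProx-style loop (hypothetical helper names).

    y-update: optimistic gradient ascent in the smooth variable,
              followed by the prox of the regulariser g.
    x-update: proximal step of the coupling function in the
              nonsmooth variable.
    """
    x, y = x0, y0
    g_prev = grad_y_phi(x, y)  # gradient at the previous iterate pair
    for _ in range(iters):
        g_cur = grad_y_phi(x, y)
        # optimistic gradient: current gradient plus an extrapolation term
        y = prox_g(y + sigma * ((1.0 + theta) * g_cur - theta * g_prev), sigma)
        # proximal step of x -> Phi(x, y) around the current x
        x = prox_x_phi(x, y, tau)
        g_prev = g_cur
    return x, y

# Toy instance: Phi(x, y) = y*|x| (nonsmooth in x, linear in y),
# g(y) = y^2/2 + indicator(y >= 0); the saddle point is (0, 0).
grad_y_phi = lambda x, y: abs(x)
prox_g = lambda v, s: max(0.0, v) / (1.0 + s)          # prox of s*g
prox_x_phi = lambda xb, y, t: np.sign(xb) * max(0.0, abs(xb) - t * y)

x, y = ogaprox(grad_y_phi, prox_g, prox_x_phi, x0=1.0, y0=0.5,
               tau=0.5, sigma=0.5, theta=1.0, iters=100)
print(x, y)  # both iterates approach the saddle point (0, 0)
```

On this toy problem each prox has a closed form: the prox of the quadratic regulariser is a (nonnegative) shrinkage, and the x-step reduces to soft-thresholding, mirroring the kind of structure the paper exploits in its nonsmooth-linear experiment.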

APA

Boţ, R. I., Csetnek, E. R., & Sedlmayer, M. (2023). An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function. Computational Optimization and Applications, 86(3), 925–966. https://doi.org/10.1007/s10589-022-00378-8
