We propose a subspace-accelerated Bregman method for the linearly constrained minimization of functions of the form f(u) + τ1‖u‖1 + τ2‖Du‖1, where f is a smooth convex function and D represents a linear operator, e.g., a finite difference operator, as in anisotropic total variation and fused lasso regularizations. Problems of this type arise in a wide variety of applications, including portfolio optimization, learning of predictive models from functional magnetic resonance imaging (fMRI) data, and source detection problems in electroencephalography. The use of ‖Du‖1 is aimed at encouraging structured sparsity in the solution. The subspaces where the acceleration is performed are selected so that the restriction of the objective function is a smooth function in a neighborhood of the current iterate. Numerical experiments for multi-period portfolio selection problems using real data sets show the effectiveness of the proposed method.
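To make the joint ℓ1 setting concrete, the following is a minimal NumPy sketch of a standard (non-accelerated) split Bregman iteration for the unconstrained least-squares variant ½‖Au − b‖² + τ1‖u‖1 + τ2‖Du‖1, with D a forward-difference operator as in the fused lasso. This is not the subspace-accelerated, linearly constrained algorithm proposed in the paper; the function names, parameter values, and synthetic data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, kappa):
    """Component-wise soft-thresholding: the prox of kappa * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - kappa, 0.0)

def split_bregman_joint_l1(A, b, D, tau1, tau2, lam1=1.0, lam2=1.0, n_iter=200):
    """Standard split Bregman iteration (illustrative sketch) for
        min_u 0.5*||A u - b||^2 + tau1*||u||_1 + tau2*||D u||_1,
    using auxiliary variables d1 ~ u and d2 ~ D u with Bregman updates."""
    n = A.shape[1]
    u = np.zeros(n)
    d1 = np.zeros(n)
    b1 = np.zeros(n)
    d2 = np.zeros(D.shape[0])
    b2 = np.zeros(D.shape[0])

    # The u-subproblem matrix is fixed across iterations; for simplicity
    # we call solve() each time instead of factoring it once.
    M = A.T @ A + lam1 * np.eye(n) + lam2 * (D.T @ D)
    Atb = A.T @ b

    for _ in range(n_iter):
        # u-update: quadratic subproblem -> linear system.
        rhs = Atb + lam1 * (d1 - b1) + lam2 * D.T @ (d2 - b2)
        u = np.linalg.solve(M, rhs)
        # d-updates: soft-thresholding for the two l1 terms.
        d1 = soft_threshold(u + b1, tau1 / lam1)
        d2 = soft_threshold(D @ u + b2, tau2 / lam2)
        # Bregman (dual) updates.
        b1 = b1 + u - d1
        b2 = b2 + D @ u - d2
    return u

# Small synthetic example: recover a sparse, piecewise-constant signal.
rng = np.random.default_rng(0)
n = 50
u_true = np.zeros(n)
u_true[10:20] = 1.0
u_true[30:40] = -0.5
A = rng.standard_normal((80, n))
b = A @ u_true + 0.01 * rng.standard_normal(80)
D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]  # forward differences: (Du)_i = u_{i+1} - u_i
u_hat = split_bregman_joint_l1(A, b, D, tau1=0.1, tau2=0.5)
print(np.round(u_hat, 2))
```

The two soft-thresholding steps act on u and Du separately, which is what the joint ℓ1 regularization promotes: entries of u are driven to zero (sparsity) and differences of consecutive entries are driven to zero (piecewise-constant structure).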
de Simone, V., Di Serafino, D., & Viola, M. (2020). A subspace-accelerated split Bregman method for sparse data recovery with joint ℓ1-type regularizers. Electronic Transactions on Numerical Analysis, 53, 406–425. https://doi.org/10.1553/etna_vol53s406