When dealing with big data problems it is crucial to design methods that decompose the original problem into smaller, more manageable pieces. Parallel methods reach a solution by concurrently working on different pieces distributed among the available agents, thereby exploiting the computational power of multi-core processors and solving the problem efficiently. Beyond gradient-type methods, which can of course be easily parallelized but suffer from practical drawbacks, a convergent decomposition framework for the parallel optimization of (possibly non-convex) big data problems was recently proposed. This framework is very flexible: it includes both fully parallel and fully sequential schemes, as well as virtually all possibilities in between. We illustrate the versatility of this parallel decomposition framework by specializing it to well-studied big data problems such as LASSO, logistic regression, and support vector machine training. We give implementation guidelines and numerical results showing that the proposed parallel algorithms work very well in practice.
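As a rough illustration of the parallel-decomposition idea applied to LASSO, the sketch below implements a generic Jacobi-style parallel coordinate descent: every coordinate's one-dimensional subproblem is solved concurrently from the same iterate, and a damping step blends the results. This is a minimal, assumption-laden sketch (function names, the damping parameter `tau`, and the thread-based parallelism are illustrative choices), not the specific algorithm of the chapter.

```python
# Sketch: Jacobi-style parallel coordinate descent for LASSO
#   min_x  0.5 * ||A x - b||^2 + lam * ||x||_1
# All names and parameters here are illustrative, not taken from the chapter.
from concurrent.futures import ThreadPoolExecutor


def soft_threshold(z, t):
    """Proximal operator of t * |.| (the shrinkage step for the l1 term)."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0


def lasso_parallel_cd(A, b, lam, iters=200, tau=0.5):
    """Damped Jacobi parallel coordinate descent (pure Python lists).

    Each coordinate's exact 1-D minimizer is computed concurrently from
    the *same* current iterate; a damping factor tau then blends the
    candidates with the current point, which keeps the fully parallel
    (Jacobi) update stable.
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n
    col_sq = [sum(A[i][j] ** 2 for i in range(m)) for j in range(n)]

    def best_j(j, r):
        # Exact minimizer over coordinate j with the others held fixed:
        # rho = A_j^T (b - A x) + ||A_j||^2 * x_j, then soft-threshold.
        rho = sum(A[i][j] * r[i] for i in range(m)) + col_sq[j] * x[j]
        return soft_threshold(rho, lam) / col_sq[j]

    with ThreadPoolExecutor() as pool:
        for _ in range(iters):
            # residual r = b - A x, shared by all concurrent subproblems
            r = [b[i] - sum(A[i][j] * x[j] for j in range(n))
                 for i in range(m)]
            targets = list(pool.map(lambda j: best_j(j, r), range(n)))
            # damped combination of the parallel candidate updates
            x = [(1 - tau) * x[j] + tau * targets[j] for j in range(n)]
    return x
```

With orthogonal columns the iterates converge to the usual soft-thresholded solution; e.g. for `A = [[1, 0], [0, 1]]`, `b = [3.0, 0.5]`, `lam = 1.0` the minimizer is `x = [2.0, 0.0]`. The damping factor `tau` trades per-iteration progress for stability of the fully parallel update.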
Sagratella, S. (2016). Convergent Parallel Algorithms for Big Data Optimization Problems. Studies in Big Data, 18, 461–474. https://doi.org/10.1007/978-3-319-30265-2_20