Communication lower bounds for distributed convex optimization: Partition data on features

Citations: 2
Mendeley readers: 19

Abstract

Recently, there has been increasing interest in designing distributed convex optimization algorithms for the setting where the data matrix is partitioned on features. Algorithms in this setting often have advantages over those in the setting where data is partitioned on samples, especially when the number of features is huge. It is therefore important to understand the inherent limitations of these optimization problems. In this paper, under certain restrictions on the communication allowed in the procedures, we develop tight lower bounds on communication rounds for a broad class of non-incremental algorithms in this setting. We also provide a lower bound on communication rounds for a class of (randomized) incremental algorithms.
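To make the feature-partitioned setting concrete, here is a minimal sketch (an illustration, not the paper's algorithm): each worker holds a column block A_k of the data matrix A and the matching block x_k of the decision variable, so forming the predictions Ax = sum_k A_k x_k requires one round of communication (a sum across workers), after which each worker can compute its block of the gradient locally. The least-squares objective and all variable names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_workers = 100, 8, 4

A = rng.standard_normal((n_samples, n_features))
b = rng.standard_normal(n_samples)
x = rng.standard_normal(n_features)

# Column-wise (feature) partition: each worker stores only its feature block.
blocks = np.array_split(np.arange(n_features), n_workers)
A_local = [A[:, idx] for idx in blocks]
x_local = [x[idx] for idx in blocks]

# Each worker computes its partial prediction A_k x_k locally ...
partial = [A_k @ x_k for A_k, x_k in zip(A_local, x_local)]
# ... and one communication round (a sum over workers) yields Ax.
Ax = np.sum(partial, axis=0)

# For f(x) = 0.5 * ||Ax - b||^2, worker k forms its gradient block
# A_k^T (Ax - b) locally, needing only its own columns and the shared residual.
residual = Ax - b
grad_local = [A_k.T @ residual for A_k in A_local]

assert np.allclose(Ax, A @ x)
assert np.allclose(np.concatenate(grad_local), A.T @ residual)
```

Contrast this with sample partitioning, where each worker holds a row block of A: there the residual is local but the full gradient must be aggregated, which is why the two settings have different communication profiles.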

Cite (APA)

Chen, Z., Luo, L., & Zhang, Z. (2017). Communication lower bounds for distributed convex optimization: Partition data on features. In 31st AAAI Conference on Artificial Intelligence, AAAI 2017 (pp. 1812–1818). AAAI press. https://doi.org/10.1609/aaai.v31i1.10912
