The bit complexity of distributed sorting

Abstract

We study the bit complexity of the sorting problem for asynchronous distributed systems. We show that for every network with a tree topology T, every sorting algorithm must send at least Ω(Δ_T log(L/N)) bits in the worst case, where {0, 1, ..., L} is the set of possible initial values, N is the number of processors, and Δ_T is the sum of the distances from all the vertices to a median of the tree. In addition, we present an algorithm that sends at most O(Δ_T log(LN/Δ_T)) bits for such trees. These bounds are tight if either L = Ω(N^(1+ε)) or Δ_T = Ω(N²). We also present results regarding average distributions. These results suggest that sorting is an inherently non-distributive problem, since it requires an amount of information transfer equal to that of concentrating all the data in a single processor, which then distributes the final results to the whole network. The importance of bit complexity, as opposed to message complexity, stems also from the fact that the lower-bound discussion makes no assumptions about the nature of the algorithm.
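The quantity Δ_T above is the sum of the distances from every vertex of the tree to a median, where a median is a vertex minimizing that sum. As a minimal illustration (not from the paper; the function names and adjacency-list representation are assumptions), Δ_T can be computed by running a BFS from each candidate vertex and taking the minimum total distance:

```python
from collections import deque

def bfs_dists(adj, src):
    """Distances from src to every vertex of an unweighted tree,
    given as an adjacency list (dict: vertex -> list of neighbors)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def delta_t(adj):
    """Δ_T: the minimum over all vertices m of sum_v d(v, m).
    The minimizing vertex is a median of the tree."""
    return min(sum(bfs_dists(adj, m).values()) for m in adj)

# Example: a path on 5 vertices; the median is the center vertex,
# so Δ_T = 2 + 1 + 0 + 1 + 2 = 6.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
```

For an N-vertex path, Δ_T grows as Θ(N²), which is the regime in which the abstract's upper and lower bounds match; for a star, Δ_T = N − 1.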

Citation (APA)

Gerstel, O., & Zaks, S. (1993). The bit complexity of distributed sorting. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 726 LNCS, pp. 181–191). Springer Verlag. https://doi.org/10.1007/3-540-57273-2_54
