An improved communication-randomness tradeoff


Abstract

Two processors receive inputs X and Y respectively. The communication complexity of a function f is the number of bits (as a function of the input size) that the processors have to exchange to compute f(X, Y) on worst-case inputs X and Y. The List-Non-Disjointness problem (X = (x_1, . . . , x_n), Y = (y_1, . . . , y_n), x_i, y_j ∈ Z_{2^n}; decide whether ∃j x_j = y_j) exhibits a maximal gap between deterministic (Θ(n²)) and Las Vegas (Θ(n)) communication complexity. Fleischer, Jung, and Mehlhorn (1995) have shown that if a Las Vegas algorithm expects to communicate Ω(n log n) bits, then this can be done with a small number of coin tosses. This result, with improved randomness efficiency, is extended here to the (much more interesting) case of efficient algorithms, i.e., those with linear communication complexity. For any R ∈ ℕ, R coin tosses suffice for O(n + n²/2^R) transmitted bits. © Springer-Verlag 2004.
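To make the tradeoff concrete, here is a minimal sketch evaluating the communication bound n + n²/2^R for sample values of R. The helper `bound` is hypothetical and takes the hidden constant in the O(·) to be 1 for illustration; the point is that R = 0 coin tosses recovers the quadratic deterministic cost, while R ≈ 2 log₂ n coin tosses already brings the bound down to the linear Las Vegas cost.

```python
def bound(n: int, R: int) -> int:
    """Illustrative evaluation of the O(n + n^2 / 2^R) tradeoff
    (constant factor taken as 1), using integer division."""
    return n + n * n // 2**R

n = 1024  # each of the n inputs is an n-bit number, so |X| = n^2 bits

# R = 0: no randomness, essentially the deterministic n^2 regime.
print(bound(n, 0))   # n + n^2

# R = log2(n) = 10: the quadratic term shrinks to n, giving 2n bits.
print(bound(n, 10))  # 2n

# R = 2*log2(n) = 20: the quadratic term is down to 1, i.e., linear cost.
print(bound(n, 20))  # n + 1
```

Doubling R beyond 2 log₂ n buys essentially nothing, which matches the abstract's framing: a small number of coin tosses already suffices for linear communication.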

Fürer, M. (2004). An improved communication-randomness tradeoff. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2976, 444–454. https://doi.org/10.1007/978-3-540-24698-5_48
