Lower bounds for the noisy broadcast problem

Navin Goyal*, Guy Kindler, Michael Saks

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We prove the first nontrivial (superlinear) lower bound in the noisy broadcast model defined by El Gamal in [Open problems presented at the 1984 workshop on Specific Problems in Communication and Computation sponsored by Bell Communication Research, in Open Problems in Communication and Computation, T. M. Cover and B. Gopinath, eds., Springer-Verlag, New York, 1987, pp. 60-62]. In this model there are n + 1 processors P0, P1, ..., Pn, each of which is initially given a private input bit xi. The goal is for P0 to learn the value of f(x1, ..., xn), for some specified function f, using a series of noisy broadcasts. At each step a designated processor broadcasts one bit to all of the other processors, and the bit received by each processor is flipped with fixed probability (independently for each recipient). In 1988, Gallager [IEEE Trans. Inform. Theory, 34 (1988), pp. 176-180] gave a noise-resistant protocol that allows P0 to learn the entire input with constant probability in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal, up to a constant factor. Our lower bound follows by reduction from a lower bound for generalized noisy decision trees, a new model which may be of independent interest. For this new model we show a lower bound of Ω(n log n) on the depth of a tree that learns the entire input. While the above lower bound is for an n-bit function, we also show an Ω(n log log n) lower bound on the number of broadcasts required to compute certain explicit boolean-valued functions, when the correct output must be attained with probability at least 1 - n^(-α) for a constant parameter α > 0 (this bound applies to all threshold functions, as well as to any other boolean-valued function with linear sensitivity). This bound also follows by reduction from a lower bound of Ω(n log n) on the depth of generalized noisy decision trees that compute the same functions with the same error. We also show a (nontrivial) Ω(n) lower bound on the depth of generalized noisy decision trees that compute such functions with small constant error. Finally, we give the first protocol in the noisy broadcast model that computes the Hamming weight of the input using a linear number of broadcasts.
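For intuition about the model (this illustration is not part of the paper), below is a minimal Python simulation sketch. It implements the noisy broadcast primitive, in which each recipient's copy of a broadcast bit is flipped independently with probability eps, together with the folklore repetition protocol: every processor broadcasts its bit Θ(log n) times and P0 decodes each bit by majority vote. This naive scheme uses O(n log n) broadcasts, the trivial upper bound that Gallager's O(n log log n) protocol improves upon; the noise rate eps = 0.1 and the repetition constant 8 are arbitrary illustrative choices.

```python
import math
import random

def noisy_broadcast(bit, n_listeners, eps, rng):
    """One broadcast step: each listener independently receives
    the bit flipped with probability eps."""
    return [bit ^ (rng.random() < eps) for _ in range(n_listeners)]

def repetition_protocol(x, eps=0.1, rng=None):
    """Folklore repetition protocol (not Gallager's): each processor
    P_i broadcasts its input bit k = Theta(log n) times, and P0
    decodes x_i by majority vote over its k noisy copies.
    Total broadcasts: n * k = O(n log n)."""
    rng = rng or random.Random(0)
    n = len(x)
    k = 8 * max(1, math.ceil(math.log2(n + 1)))  # repetitions per bit
    decoded = []
    for xi in x:
        # Simulate all n + 1 recipients, but P0 only uses its own copy
        # (index 0) of each of the k broadcasts of x_i.
        copies = [noisy_broadcast(xi, n + 1, eps, rng)[0] for _ in range(k)]
        decoded.append(int(sum(copies) > k / 2))
    return decoded

if __name__ == "__main__":
    rng = random.Random(42)
    x = [rng.randint(0, 1) for _ in range(200)]
    print(repetition_protocol(x, rng=rng) == x)  # True w.h.p.
```

A standard Chernoff plus union bound shows that k = Θ(log n) repetitions per bit drive the overall failure probability to o(1); the paper's main result is that this log n factor cannot be improved beyond Gallager's log log n when P0 must learn the entire input.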

Original language: English
Pages (from-to): 1806-1841
Number of pages: 36
Journal: SIAM Journal on Computing
Volume: 37
Issue number: 6
DOIs
State: Published - 2007
Externally published: Yes

Keywords

  • Computation in presence of noise
  • Distributed computation
