Probabilistic communication complexity

Authors:

Highlights:

Abstract

Communication is a bottleneck in many distributed computations. In VLSI, communication constraints dictate lower bounds on the performance of chips. The two-processor information transfer model measures the communication requirements to compute functions. We study the unbounded error probabilistic version of this model. Because of its weak notion of correct output, we believe that this model measures the “intrinsic” communication complexity of functions. We present exact characterizations of the unbounded error communication complexity in terms of arrangements of hyperplanes and approximations of matrices. These characterizations establish the connection with certain classical problems in combinatorial geometry which are concerned with the configurations of points in d-dimensional real space. With the help of these characterizations, we obtain some upper and lower bounds on communication complexity. The upper bounds which we obtained for the functions—equality and verification of Hamming distance—are considerably better than their counterparts in the deterministic, the nondeterministic, and the bounded error probabilistic models. We also exhibit a function which has log n complexity. We present a counting argument to show that most functions have linear complexity. Further, we apply the logarithmic lower bound on communication complexity to obtain an Ω(n log n) bound on the time of 1-tape unbounded error probabilistic Turing machines. We believe that this is the first nontrivial lower bound obtained for such machines.
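To make the unbounded error model concrete, the following Python sketch (an illustration, not the paper's construction) simulates a simple protocol for the equality function: Alice sends a random index and the corresponding bit of her string, roughly log n + 1 bits, and Bob accepts a match only with a probability q chosen strictly between 1/2 and n/(2(n-1)), so the answer is correct with probability strictly above 1/2 whether or not the inputs are equal. The paper's arrangement-based characterization yields an even stronger (constant) upper bound for equality; the function names and the particular choice of q below are illustrative assumptions.

```python
import random


def unbounded_error_eq(x: str, y: str, rng: random.Random) -> bool:
    """One run of a toy unbounded-error protocol for EQUALITY (illustrative).

    Alice holds x, Bob holds y (both n-bit strings). Alice's message is a
    random index i together with x[i], about log n + 1 bits. Bob rejects on
    a mismatch; on a match he accepts only with probability q, tuned so the
    output is correct with probability strictly greater than 1/2 in both cases.
    """
    n = len(x)
    i = rng.randrange(n)          # Alice's random index
    if x[i] != y[i]:
        return False              # a mismatch proves x != y
    # Accept with probability q, where 1/2 < q < n / (2 * (n - 1)).
    q = (2 * n - 1) / (4 * (n - 1))
    return rng.random() < q


def success_probability(x: str, y: str, trials: int = 200_000) -> float:
    """Monte Carlo estimate of the probability the protocol answers correctly."""
    rng = random.Random(0)
    correct = sum(unbounded_error_eq(x, y, rng) == (x == y) for _ in range(trials))
    return correct / trials


if __name__ == "__main__":
    x = "0110100110010110"                              # n = 16
    y_diff = x[:-1] + ("1" if x[-1] == "0" else "0")    # differs in one bit: hardest case
    print("x == y :", success_probability(x, x))        # about 0.517, above 1/2
    print("x != y :", success_probability(x, y_diff))   # about 0.516, above 1/2
```

The margin above 1/2 shrinks with n, which the unbounded error model permits; this is exactly the "weak notion of correct output" that lets equality be so much cheaper here than in the deterministic, nondeterministic, or bounded error settings.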

Keywords:

Article history: Received 15 February 1985, Revised 26 February 1986, Available online 2 December 2003.

DOI: https://doi.org/10.1016/0022-0000(86)90046-2