Lecture 2: Importance of Problem Structures


(Warning: These materials may be subject to lots of typos and errors. We are grateful if you could spot errors and leave suggestions in the comments, or contact the author at yjhan@stanford.edu.) 

This lecture continues the introduction of information-theoretic lower bounds without formally developing the tools. The emphasis of this lecture is that the dependence of the fundamental limits on the problem structure may be very subtle: the results may change significantly even with tiny changes in the problem setting. Hence, one should be careful when applying known lower bounds to similar problems, and it is vital to fully understand the specific problem structure. We will use a specific example of communication complexity from the theoretical computer science literature to illustrate this phenomenon. 

1. Equality Evaluation: Deterministic Communication Complexity 

Alice and Bob hold vectors x,y \in \{0,1\}^n respectively, and they would like to check whether x=y or not. However, they live very far from each other, and only limited communication resources are available. What is the minimum number of communication bits to perform this task reliably, in the sense that they always find the correct answer for any pair (x,y)? In other words, what is the deterministic communication complexity of the distributed computing problem for the equality function f(x,y)=1(x=y)?

To answer this question, we need to formally define all possible communication protocols. Here we define a general class of blackboard communication protocols. 

Definition 1 (Blackboard Communication Protocol) A blackboard communication protocol over domain X\times Y with range Z is a binary tree where each internal node v is labeled by a function a_v: X\rightarrow \{0,1\} or by a function b_v: Y\rightarrow \{0,1\}, and each leaf is labeled by an element z\in Z.

The value of the protocol on input (x,y) is the label of the leaf reached by starting from the root and walking on the tree using the following rule: at each internal node labeled by a_v, walk left if a_v(x)=0 and right if a_v(x)=1; at each internal node labeled by b_v, walk left if b_v(y)=0 and right if b_v(y)=1. The cost of the protocol is the height of the tree. 

Intuitively speaking, there is a blackboard in front of both Alice and Bob in a blackboard communication protocol on which they can write binary bits, and this process continues sequentially where later bits can depend on the entire history. The formal definition above simply uses a binary tree to keep track of the history, and uses functions a_v, b_v to illustrate who will be writing on the blackboard at stage v. The communication cost of a given protocol is simply the maximum length of the message written on the blackboard, where the maximum is taken over all possible input pairs (x,y)\in X\times Y. 
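The tree-walking rule above can be sketched in code. Below is a minimal illustrative sketch (the class and function names are my own, not from the lecture) of a blackboard protocol as a binary tree, together with a trivial protocol computing equality of single bits:

```python
# Illustrative sketch of Definition 1: a blackboard protocol as a binary tree.
# Internal nodes hold a predicate of one party's input; leaves hold the output.

class Leaf:
    def __init__(self, value):
        self.value = value

class Node:
    def __init__(self, who, pred, left, right):
        self.who = who      # "A" if the predicate reads Alice's input, "B" for Bob's
        self.pred = pred    # function mapping the input to a bit in {0, 1}
        self.left = left    # subtree taken when the predicate evaluates to 0
        self.right = right  # subtree taken when the predicate evaluates to 1

def run_protocol(root, x, y):
    """Walk from the root to a leaf; each step writes one bit on the blackboard."""
    node, cost = root, 0
    while isinstance(node, Node):
        bit = node.pred(x) if node.who == "A" else node.pred(y)
        node = node.right if bit else node.left
        cost += 1
    return node.value, cost

# Trivial 2-bit protocol for equality of single bits: Alice announces x,
# then Bob compares it with y.
eq1 = Node("A", lambda x: x,
           Node("B", lambda y: y, Leaf(1), Leaf(0)),   # branch for x = 0
           Node("B", lambda y: y, Leaf(0), Leaf(1)))   # branch for x = 1
```

The cost of `eq1` is 2 on every input, matching the height of the tree.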

Now the question we are asking is: among all blackboard communication protocols which always give the correct evaluation of the equality function, what is the minimum communication cost? This is also known as the deterministic communication complexity, where “deterministic” here means zero-error. 

To prove a lower bound on the deterministic complexity, we need to make the following key observation: 

Lemma 2 (Copy-paste Property) If a blackboard communication protocol arrives at the same leaf node v on both inputs (x_1,y_1) and (x_2,y_2), then on input (x_1,y_2) the protocol also arrives at v.

Proof: For each internal node v' labeled by a_{v'} on the path from the root to v, the condition ensures that a_{v'}(x_1) = a_{v'}(x_2). Similarly, b_{v'}(y_1)=b_{v'}(y_2) if the internal node is labeled by b_{v'}. Hence, on input (x_1,y_2) the walk takes the same direction at every node on this path as on input (x_1,y_1), and the result is immediate.  \Box 

The above copy-paste property reveals the following limitation of any blackboard communication protocol: the set of all input pairs (x,y) arriving at any leaf node forms a rectangle, i.e., takes the form X_v\times Y_v. Hence, if the function to be computed is not constant on large rectangles, each leaf node can only take care of a small rectangle and many leaves are required. This is the key insight of the following theorem. 

Theorem 3 (Log-rank Inequality) Let M\in \{0,1\}^{X\times Y} be a binary matrix defined as M(x,y)=f(x,y) for all x\in X, y\in Y. Then the deterministic communication complexity of computing f is at least \log_2 \text{\rm rank}(M), where the linear rank is understood over {\mathbb R}.

Proof: For any leaf node v which outputs 1, define M_v(x,y) = 1(v\text{ is reached on input }(x,y)). Then clearly M= \sum_v M_v. By the sub-additivity of rank, we have 

 \text{rank}(M) \le \sum_v \text{rank}(M_v). \ \ \ \ \ (1)

However, by Lemma 2, each M_v is the indicator of a rectangle and hence \text{rank}(M_v)\le 1, so inequality (1) implies that the number of leaf nodes is at least \text{rank}(M). Since a binary tree of height h has at most 2^h leaves, the cost of the protocol is at least \log_2 \text{rank}(M), and we are done.  \Box

Applying Theorem 3 to the equality function, the matrix M is a 2^n\times 2^n identity matrix, and therefore the deterministic communication complexity is at least n. This is essentially tight: Alice can communicate the entire vector x to Bob using n bits and Bob then sends the 1-bit evaluation of the equality function back to Alice. Hence, the deterministic communication complexity of equality evaluation is \Theta(n). 
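As a quick numerical sanity check of Theorem 3 for equality (an illustrative sketch, assuming NumPy is available), one can verify that the communication matrix is the identity and that the resulting lower bound \log_2 \text{rank}(M) equals n:

```python
# Numerical check of the log-rank bound for equality on n bits: the
# communication matrix M(x, y) = 1(x = y) is the 2^n x 2^n identity,
# whose rank over the reals is 2^n, so log2 rank(M) = n.
import numpy as np

n = 4
M = np.eye(2 ** n)                                  # the equality matrix
lower_bound = np.log2(np.linalg.matrix_rank(M))
print(lower_bound)                                  # prints 4.0, matching n
```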

2. Equality Evaluation: Randomized Private-coin Communication Complexity 

Next we consider a slight variant of the previous problem: what is the communication complexity if Alice and Bob can tolerate some maximum probability of error \varepsilon>0 on all inputs? This quantity is called the randomized communication complexity. Here to introduce randomness, we assume that both Alice and Bob have access to some (private, not shared) random number generators. We also assume that \varepsilon is a small constant (say, \varepsilon=0.01) and suppress the dependence of the communication complexity on \varepsilon. 

A striking result is that, if we allow some small probability of error (even of the order 1/\text{poly}(n)), the randomized communication complexity drops from \Theta(n) to O(\log n). Here is a communication protocol: by Bertrand’s postulate there exists some prime number p\in [n^2,2n^2]. Alice chooses such a p and z\sim \text{Unif}([p]), evaluates the polynomial 

 q(x,z) = x_0 + x_1z + \cdots + x_{n-1}z^{n-1} \pmod p, \ \ \ \ \ (2)

and sends (p,z,q(x,z)) to Bob using O(\log n) bits, where x=(x_0,x_1,\cdots,x_{n-1}). Bob then evaluates q(y,z) in (2) and outputs 1(q(x,z)=q(y,z)). Clearly, if x=y then Bob always outputs 1 correctly. If x\neq y, the map z\mapsto q(x,z) - q(y,z) is a non-zero polynomial of degree at most n-1, so it has at most n-1 zeros in \mathbb{F}_p. Consequently, Bob makes a mistake with probability at most (n-1)/p\le 1/n for any x\neq y.
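The fingerprinting protocol above can be sketched as follows (an illustrative sketch; the helper names and the brute-force prime search are my own assumptions for readability, not part of the lecture):

```python
# Sketch of the private-coin fingerprinting protocol: Alice sends the
# O(log n)-bit fingerprint (p, z, q(x, z)); Bob recomputes q(y, z) and compares.
import random

def is_prime(p):
    # naive trial division, sufficient for the small range [n^2, 2n^2]
    return p > 1 and all(p % d for d in range(2, int(p ** 0.5) + 1))

def q(bits, z, p):
    # Evaluate x_0 + x_1 z + ... + x_{n-1} z^{n-1} mod p by Horner's rule.
    acc = 0
    for b in reversed(bits):
        acc = (acc * z + b) % p
    return acc

def equality_private_coin(x, y):
    n = len(x)
    # Bertrand's postulate guarantees a prime in [n^2, 2n^2]
    p = next(m for m in range(n * n, 2 * n * n + 1) if is_prime(m))
    z = random.randrange(p)            # Alice's private randomness
    return q(x, z, p) == q(y, z, p)    # Bob's test on the received fingerprint
```

When x=y the test always accepts; when x\neq y it wrongly accepts with probability at most (n-1)/p.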

The previous upper bound shows that Theorem 3 no longer holds for the randomized communication complexity. In fact, a careful inspection of the proof shows that the zero-error property is crucial for Theorem 3 to hold. However, we may still perform a deterministic simulation of any randomized protocol and reduce to the previous case. 

Theorem 4 Let D(f) and R(f) be the deterministic and randomized private-coin communication complexity of computing f, respectively. Then

 D(f) \le 2^{R(f)}\left(-\log_2 \left(\frac{1}{2}-\varepsilon\right) + R(f) \right). \ \ \ \ \ (3)

Proof: Given a randomized protocol, Alice can transmit the probabilities of arriving at all 2^{R(f)} leaf nodes based on her input x. Given these probabilities, Bob can privately evaluate the probability of arriving at each leaf based on the inputs (x,y) (which is made possible by the product law of independent events, thanks to the private-coin assumption). Then Bob sums up the probabilities of leaf nodes which output 0 and 1, respectively, and takes the majority vote. Since the error probability of the randomized protocol is at most \varepsilon<1/2, the output given by the majority vote is always correct. The proof of (3) is completed by noting that the probabilities sent by Alice only need to be specified within precision (\frac{1}{2}-\varepsilon)2^{-R(f)} to preserve the majority vote.  \Box 

Applying Theorem 4 to the equality function f, for which D(f)\ge n by Theorem 3, we immediately obtain that R(f) = \Omega(\log n). Hence, the randomized private-coin communication complexity of equality evaluation is \Theta(\log n). 

3. Equality Evaluation: Randomized Public-coin Communication Complexity 

We ask the same question as in the previous section, but now Alice and Bob have shared randomness. Careful readers may have noticed a subtle point used in the proof of Theorem 4, i.e., the identity \mathop{\mathbb P}(A\cap B)=\mathop{\mathbb P}(A)\mathop{\mathbb P}(B) for events A,B determined by independent private randomness. Hence, the \Omega(\log n) lower bound does not directly apply to the public-coin scenario, and public-coin protocols may perform much better. This is indeed the case: the randomized public-coin communication complexity is actually \Theta(1). 

There is nothing to prove for the lower bound. For the upper bound, here is a simple protocol: both Alice and Bob draw (the same collection of) independent vectors z_1, \cdots, z_m\sim \text{Unif}(\mathbb{F}_2^n). Then Alice sends the m-bit vector (z_1^\top x, z_2^\top x, \cdots, z_m^\top x) to Bob, and Bob claims that x=y if and only if z_i^\top x = z_i^\top y for all i\in [m], where all inner products are over \mathbb{F}_2. Clearly, this protocol errs with probability at most 2^{-m} on all inputs. Hence, the randomized public-coin communication complexity of equality evaluation is \Theta(1). 
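A sketch of this public-coin protocol (illustrative names; the shared randomness is modeled simply by both parties using the same list of vectors):

```python
# Sketch of the public-coin protocol: both parties use the same m random
# vectors z_i over F_2 and compare the parities z_i^T x and z_i^T y mod 2.
import random

def sketch(bits, zs):
    # the m-bit message: the parity <z_i, bits> mod 2 for each shared z_i
    return [sum(zb * b for zb, b in zip(z, bits)) % 2 for z in zs]

def equality_public_coin(x, y, m=20, rng=random):
    n = len(x)
    zs = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]  # shared coins
    # errs with probability at most 2^{-m} when x != y, never when x == y
    return sketch(x, zs) == sketch(y, zs)
```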

4. Equality Evaluation: Randomized One-round Private-coin Communication Complexity 

Now we consider a final variant of the distributed equality evaluation problem, where we assume a randomized private-coin protocol but restrict the protocol to be only one-round. Recall that the previous blackboard communication protocol allows for multiple rounds of interaction, i.e., Alice may provide feedback on the messages transmitted by Bob, and vice versa. In the one-round scenario, Alice and Bob each send a k-bit message once to a central referee, and the target is to minimize the communication cost k while keeping the maximum error probability below \varepsilon. What is the communication complexity in this scenario? 

Surprisingly, the communication complexity is \Theta(\sqrt{n}), which is different from all the previous ones. We first prove the lower bound. Notice that a one-round protocol has a more concise representation than a binary tree: any randomized strategy of Alice or Bob may be written as a stochastic mapping from [2^n] to [2^k]. In particular, the copy-paste property in Lemma 2 becomes insufficient for our purposes. Let A,B\in [0,1]^{2^n\times 2^k} be the transition matrices of the stochastic mappings of Alice and Bob, respectively, and let R\in [0,1]^{2^k\times 2^k} be the matrix with R(m_A,m_B) being the probability that the referee outputs 1 given messages (m_A, m_B). Since (ARB^\top)_{x,y} is the probability of outputting 1 on inputs (x,y), the correctness property ensures that 

 \|\text{vec}(ARB^\top - I_{2^n}) \|_\infty \le \varepsilon, \ \ \ \ \ (4)

where I_m denotes the m\times m identity matrix, and \|\cdot \|_\infty denotes the L_\infty norm of vectors. Hence, (4) states that ARB^\top is an approximate non-negative matrix factorization of the identity matrix, so 2^k upper bounds the approximate non-negative rank of the identity matrix. The following result in linear algebra provides a lower bound on k. 

Theorem 5 Let 0<\varepsilon\le 0.1. If there exist stochastic matrices A,B\in [0,1]^{N\times m} and a non-negative matrix R\in [0,1]^{m\times m} such that \|\text{\rm vec}(ARB^\top - I_N)\|_\infty\le \varepsilon, then

 m \ge 2^{\Omega(\sqrt{\log N})}.

The proof of Theorem 5 is quite involved, and we refer interested readers to the reference cited in the bibliographic notes. Now k=\Omega(\sqrt{n}) is an immediate consequence of (4) and Theorem 5 with N=2^n. 

To show the upper bound k=O(\sqrt{n}), the idea is to simulate a randomized protocol by deterministic ones. Recall that in the previous section, there is a randomized public-coin protocol which requires O(1) bits of communication. Let R_1, \cdots, R_m be m independent deterministic realizations of the randomized protocol with maximum error probability \varepsilon, then by Hoeffding’s inequality, for any fixed input (x,y) we have 

 \mathop{\mathbb P}\left(\frac{1}{m}\sum_{i=1}^m 1(R_i(x,y) \neq 1(x=y)) \ge 2\varepsilon \right) \le 2\exp\left(-2m\varepsilon^2\right). \ \ \ \ \ (5)

Hence, by (5) and a union bound over the 2^{2n} possible inputs, for fixed \varepsilon>0 we may simply choose m=O(n) such that the average probability of error of the m deterministic protocols on each input is at most 2\varepsilon. 
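As a quick sanity check of the choice m=O(n) (an illustrative calculation, not from the lecture), the union bound requires 2^{2n}\cdot 2\exp(-2m\varepsilon^2)<1, which holds once m exceeds (2n+1)\ln 2/(2\varepsilon^2), a quantity linear in n:

```python
# Numeric check that m = O(n) suffices for the union bound over 2^{2n} inputs:
# we need 2^{2n} * 2 * exp(-2 m eps^2) < 1, i.e. m > (2n + 1) ln 2 / (2 eps^2).
import math

def m_needed(n, eps=0.1):
    # smallest m (plus a safety margin of 1) making the union bound strict
    return math.ceil((2 * n + 1) * math.log(2) / (2 * eps ** 2)) + 1

print(m_needed(100))   # grows linearly in n
```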

Now the private-coin protocol is as follows: put the above m deterministic protocols in a \sqrt{m}\times\sqrt{m} matrix. Alice draws i\sim \mathsf{Unif}([\sqrt{m}]), runs all protocols in the i-th row and sends all outputs to the referee (together with i). Similarly, Bob draws j\sim \mathsf{Unif}([\sqrt{m}]), runs all protocols in the j-th column and sends all outputs to the referee (together with j). The central referee simply looks at the protocol at the (i,j)-th entry, whose probability of error is exactly the average probability of error of these m deterministic protocols, which is at most 2\varepsilon. Meanwhile, the communication cost of this protocol is O(\sqrt{m}) = O(\sqrt{n}), as desired. 
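The grid construction can be sketched as a toy simulation (all names and parameters here, e.g. fixing the public coins via a seeded generator with 8 parities per protocol, are illustrative assumptions, not from the lecture):

```python
# Toy simulation of the one-round grid construction: fix m = side^2
# deterministic protocols (public coins frozen by a seed), arrange them in a
# side x side grid, and let the referee read the (i, j)-th entry from Alice's
# row of messages and Bob's column of messages.
import random

def make_deterministic_protocols(n, m, bits_per_protocol=8):
    # each "deterministic protocol" is a frozen list of random F_2 vectors
    rng = random.Random(0)
    return [[[rng.randrange(2) for _ in range(n)] for _ in range(bits_per_protocol)]
            for _ in range(m)]

def parities(bits, zs):
    return tuple(sum(a * b for a, b in zip(z, bits)) % 2 for z in zs)

def one_round_equality(x, y, side=8):
    protocols = make_deterministic_protocols(len(x), side * side)
    i = random.randrange(side)          # Alice's private coin
    j = random.randrange(side)          # Bob's private coin
    # Alice sends the outputs of all protocols in row i; Bob sends column j.
    row = [parities(x, protocols[i * side + t]) for t in range(side)]
    col = [parities(y, protocols[t * side + j]) for t in range(side)]
    return row[j] == col[i]             # referee checks the (i, j)-th protocol
```

The referee only needs the two message vectors of length `side` = O(\sqrt{m}), matching the O(\sqrt{n}) cost in the text.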

5. Bibliographic Notes 

Communication complexity is a central topic in theoretical computer science. The blackboard communication protocol was proposed by Yao in 1979, and some central tools (including Theorems 3 and 4) are available in the excellent book by Kushilevitz and Nisan (1996). The proof of Theorem 5 is given by Newman and Szegedy (1996), and the simulation idea is due to Newman (1991). 

  1. Andrew Chi-Chih Yao, Some complexity questions related to distributive computing (preliminary report). In Proceedings of the eleventh annual ACM symposium on Theory of computing, pages 209–213. ACM, 1979. 
  2. Eyal Kushilevitz and Noam Nisan, Communication complexity. Cambridge University Press. 1996. 
  3. Ilan Newman and Mario Szegedy, Public vs private coin flips in one round communication games, In Proceedings of the 28th annual ACM symposium on Theory of computing, pages 561–570. ACM, 1996. 
  4. Ilan Newman, Private vs. common random bits in communication complexity. Information Processing Letters, 39(2): 67–71, 1991. 
