Lecture 14 - 2025/5/27
Communication Complexity
Definition (Communication Complexity): f(x,y) ∈ {0,1}, where x, y ∈ {0,1}^*. Alice holds x and Bob holds y, and the goal is to compute f(x,y). The communication complexity CC(f) is the number of bits Alice and Bob must exchange in the worst case, under the best protocol (measured as a function of the input length n).
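As a sanity check on the definition, here is a minimal Python sketch (the helper name trivial_protocol and the transcript-as-list representation are illustrative assumptions, not from the lecture) of the trivial protocol in which Alice sends her whole input: Bob then knows both x and y, evaluates f, and sends back the one-bit answer, so CC(f) ≤ n + 1 for every f.

```python
def trivial_protocol(f, x, y):
    """Trivial protocol: Alice sends all of x, Bob replies with f(x, y).

    x and y are bit strings of the same length n; f maps two bit strings
    to 0/1.  The total communication is n + 1 bits for any f.
    """
    transcript = []

    # Alice -> Bob: the n bits of x.
    transcript.extend(int(b) for b in x)

    # Bob now knows both x and y, evaluates f, and sends the 1-bit answer back.
    answer = f(x, y)
    transcript.append(answer)

    return answer, len(transcript)


EQ = lambda x, y: int(x == y)
print(trivial_protocol(EQ, "0110", "0110"))  # (1, 5): 4 bits of x plus 1 answer bit
```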
CC(f) ≥ log_2 χ(f),
where χ(f) is the minimum, over all partitions of the inputs into monochromatic rectangles, of the number of rectangles used. The bound holds because the set of inputs sharing a given transcript of a c-bit protocol forms a combinatorial rectangle on which f is constant, so a protocol of cost c partitions the inputs into at most 2^c monochromatic rectangles.
Remark: the bound log_2 χ(f) corresponds to the convention that both Alice and Bob learn the result; this may differ by at most 1 bit from the version where only one party needs the answer, which is negligible compared to n.
Example: EQ(x,y)=[x=y]
M(EQ) for n = 2 (rows indexed by x, columns by y, both ordered 00, 01, 10, 11):
1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1
("Send 1 bit": each communicated bit splits the current rectangle of the matrix in two.)
The boxed elements, i.e. the diagonal entries (x, x), cannot lie in the same monochromatic rectangle: a rectangle containing (x, x) and (x', x') with x ≠ x' would also contain (x, x'), where EQ = 0. Hence χ(EQ) ≥ 2^n, so CC(EQ) ≥ n.
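The fooling-set argument above can be checked mechanically for small n; the following Python snippet (the function name eq and the dictionary encoding of M are illustrative choices) verifies that every diagonal entry of M(EQ) is 1 while every pair of distinct inputs has a mixed 0-entry, which is exactly the condition that forbids two boxed elements from sharing a monochromatic rectangle.

```python
from itertools import product

def eq(x, y):
    return int(x == y)

n = 3
inputs = ["".join(bits) for bits in product("01", repeat=n)]

# M(EQ) as a dictionary: M[(x, y)] = EQ(x, y).
M = {(x, y): eq(x, y) for x in inputs for y in inputs}

# Fooling-set check: every diagonal entry (x, x) is 1, but for distinct x, x'
# at least one of the mixed entries (x, x'), (x', x) is 0.  A monochromatic
# rectangle containing (x, x) and (x', x') would also contain both mixed
# entries, so no rectangle holds two diagonal entries: chi(EQ) >= 2^n.
for x in inputs:
    assert M[(x, x)] == 1
for x, xp in product(inputs, repeat=2):
    if x != xp:
        assert M[(x, xp)] == 0 or M[(xp, x)] == 0

print(f"{{(x, x)}} is a fooling set of size 2^{n} = {len(inputs)} for EQ")
```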
Example: NAND(x,y) = ¬(x ∧ y), where x ∧ y := ⋁_{i=1}^n (x_i ∧ y_i); that is, NAND(x,y) = 1 iff there is no index i with x_i = y_i = 1.
M(NAND) for n = 2 (rows indexed by x, columns by y, both ordered 00, 01, 10, 11):
1 1 1 1
1 0 1 0
1 1 0 0
1 0 0 0
The boxed elements, i.e. the entries (x, x̄) where x̄ is the bitwise complement of x, cannot lie in the same monochromatic rectangle: if x ≠ x', there is an index i with x_i = 1 and x̄'_i = 1 (or x'_i = 1 and x̄_i = 1), so NAND(x, x̄') = 0 or NAND(x', x̄) = 0, while a rectangle containing (x, x̄) and (x', x̄') would contain both mixed entries. Hence χ(NAND) ≥ 2^n, so CC(NAND) ≥ n.
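The same kind of check works for NAND, using the pairs (x, x̄); this Python sketch (the helper names nand and complement are illustrative) verifies the fooling-set condition for a small n.

```python
from itertools import product

def nand(x, y):
    """NAND(x, y) = 1 iff x and y have no common 1-position."""
    return int(not any(a == "1" and b == "1" for a, b in zip(x, y)))

def complement(x):
    return "".join("1" if b == "0" else "0" for b in x)

n = 3
inputs = ["".join(bits) for bits in product("01", repeat=n)]

# Fooling-set check: NAND(x, complement(x)) = 1 for every x, but for distinct
# x, x' at least one mixed pair (x, complement(x')), (x', complement(x)) is a
# 0-entry, so no monochromatic rectangle contains two of the boxed entries:
# chi(NAND) >= 2^n.
for x in inputs:
    assert nand(x, complement(x)) == 1
for x, xp in product(inputs, repeat=2):
    if x != xp:
        assert nand(x, complement(xp)) == 0 or nand(xp, complement(x)) == 0

print(f"The pairs (x, complement(x)) form a fooling set of size 2^{n} = {len(inputs)} for NAND")
```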
For f(x,y) with x, y ∈ {0,1}^n, let M(f) be the 2^n × 2^n matrix with entries M(f)[x, y] = f(x, y). What is the relation between rank(M(f)) and χ(f)?
Recall that a partition counted by χ(f) decomposes M(f) into monochromatic rectangles: each 1-rectangle is a rank-1 matrix and each 0-rectangle is the zero matrix, so M(f) is a sum of at most χ(f) rank-1 matrices. By rank(A + B) ≤ rank(A) + rank(B), it follows that rank(M(f)) ≤ χ(f).
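As a concrete illustration of the rank bound, the following Python/NumPy sketch (the helper name comm_matrix is an illustrative assumption) builds M(EQ) and M(NAND) for a small n and computes their ranks over the reals; since CC(f) ≥ log_2 χ(f) ≥ log_2 rank(M(f)), full rank 2^n recovers the n-bit lower bounds shown above.

```python
import numpy as np
from itertools import product

def comm_matrix(f, n):
    """Build M(f): the 2^n x 2^n matrix with entry M[x, y] = f(x, y)."""
    inputs = ["".join(bits) for bits in product("01", repeat=n)]
    return np.array([[f(x, y) for y in inputs] for x in inputs])

eq = lambda x, y: int(x == y)
nand = lambda x, y: int(not any(a == "1" and b == "1" for a, b in zip(x, y)))

n = 3
M_eq = comm_matrix(eq, n)      # the 2^n x 2^n identity matrix
M_nand = comm_matrix(nand, n)  # the n-fold tensor power of [[1, 1], [1, 0]]

# Both matrices have full rank 2^n over the reals, so the bound
# rank(M(f)) <= chi(f) reproduces chi >= 2^n for EQ and NAND.
print("rank(M(EQ))   =", np.linalg.matrix_rank(M_eq), "  2^n =", 2 ** n)
print("rank(M(NAND)) =", np.linalg.matrix_rank(M_nand), "  2^n =", 2 ** n)
```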