Lecture 15 - 2025 / 6 / 3

$EQ(x, y) = [x = y]$, where $x, y \in \{0, 1\}^*$

  1. randomized algorithm
  2. high probability correctness

Prime Number Theorem (PNT): $\#\{\text{primes} \le m\} \sim \dfrac{m}{\ln m}$
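A quick numerical sanity check of the PNT (not from the lecture; a standard sieve, with the function name `prime_count` chosen here for illustration):

```python
import math

def prime_count(m):
    """Count primes <= m with a simple sieve of Eratosthenes."""
    sieve = [True] * (m + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(m ** 0.5) + 1):
        if sieve[i]:
            # Mark all multiples of i starting at i*i as composite.
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return sum(sieve)

# The ratio pi(m) / (m / ln m) approaches 1 slowly as m grows.
for m in (10 ** 3, 10 ** 4, 10 ** 5):
    print(m, prime_count(m) / (m / math.log(m)))
```

In particular, the interval $[n^{100}, n^{101}]$ contains many primes, so the protocol below can always find one.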

Protocol:

Alice has $x \in \{0, 1\}^n$, i.e. $x = x_0 x_1 \cdots x_{n-1}$, and Bob has $y \in \{0, 1\}^n$, i.e. $y = y_0 y_1 \cdots y_{n-1}$. They want to compute $EQ(x, y)$.

  1. Fix a prime $p \in [n^{100}, n^{101}]$ (known to both Alice and Bob)
  2. Alice: randomly choose an integer $t \in \{0, 1, \cdots, p - 1\}$
  3. Alice: compute $f(t) = (x_{n-1} t^{n-1} + x_{n-2} t^{n-2} + \cdots + x_0) \bmod p$
  4. Alice: send $t$ and $f(t)$ to Bob (using $O(\log p) = O(\log n)$ bits)
  5. Bob: on receiving $t$ and $f(t)$, compute $g(t) = (y_{n-1} t^{n-1} + y_{n-2} t^{n-2} + \cdots + y_0) \bmod p$
  6. Bob: compare $f(t)$ and $g(t)$; if they are equal, output $1$, otherwise output $0$.
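The steps above can be sketched in code. This is a minimal illustration with hypothetical names (`poly_hash`, `eq_protocol`) and a demo prime far smaller than $n^{100}$; in the protocol, $p$ would be chosen from $[n^{100}, n^{101}]$:

```python
import random

def poly_hash(bits, t, p):
    """Evaluate f(t) = bits[n-1]*t^(n-1) + ... + bits[0] mod p via Horner's rule."""
    acc = 0
    for b in reversed(bits):
        acc = (acc * t + b) % p
    return acc

def eq_protocol(x, y, p):
    """One round of the randomized EQ protocol; x, y are bit lists of length n."""
    t = random.randrange(p)           # Alice samples t uniformly from {0, ..., p-1}
    f_t = poly_hash(x, t, p)          # Alice computes f(t) and sends (t, f(t))
    g_t = poly_hash(y, t, p)          # Bob evaluates his own polynomial at t
    return 1 if f_t == g_t else 0     # wrong only if t is a root of f - g
```

If $x = y$ the output is always $1$; if $x \ne y$ the output is $0$ except with probability at most $(n-1)/p$.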

An error occurs only when $f \ne g$ (i.e. $x \ne y$) but $f(t) = g(t)$, i.e. $t$ is a root of $f - g$. Since $f - g$ is a nonzero polynomial of degree at most $n - 1$ over $\mathbb{Z}_p$, it has at most $n - 1$ roots, so the probability that a uniformly random $t \in \{0, 1, \cdots, p - 1\}$ is a root of $f - g$ is at most $\dfrac{n-1}{p} = O(n^{-99})$.

Max Entropy

Let $X$ be a $d$-dimensional random vector with $E(X) = 0$ and $Cov(X) = E(X X^T) = \Sigma$. Under these constraints, the maximum entropy distribution is the multivariate Gaussian $\mathcal N(0, \Sigma)$.

Theorem. Let $X$ be a $d$-dimensional random vector with $E(X) = 0$ and $Cov(X) = \Sigma$, and let $Y \sim \mathcal N(0, \Sigma)$. Then $h(X) \le h(Y)$.

Proof.

$$
\begin{aligned}
D(X \| Y) & = \int f_X(t) \ln \frac{f_X(t)}{f_Y(t)} \,\mathrm{d}t \\
& = - h(X) - \int f_X(t) \ln f_Y(t) \,\mathrm{d}t \\
& = - h(X) - \ln \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} + \int f_X(t)\, \frac{t^T \Sigma^{-1} t}{2} \,\mathrm{d}t \\
& = - h(X) - \ln \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} + \int f_Y(t)\, \frac{t^T \Sigma^{-1} t}{2} \,\mathrm{d}t \\
& = - h(X) + h(Y) \ge 0
\end{aligned}
$$

The fourth line uses $\int f_X(t)\, t^T \Sigma^{-1} t \,\mathrm{d}t = E[X^T \Sigma^{-1} X] = \operatorname{tr}(\Sigma^{-1} \Sigma) = d = E[Y^T \Sigma^{-1} Y]$, which holds because $X$ and $Y$ have the same covariance $\Sigma$.
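As a consistency check (a standard computation, not part of the lecture), the Gaussian differential entropy appearing in the last line can be evaluated in closed form:

```latex
\begin{aligned}
h(Y) & = -\int f_Y(t) \ln f_Y(t) \,\mathrm{d}t \\
     & = \ln\!\left( (2\pi)^{d/2} |\Sigma|^{1/2} \right)
       + \mathbb{E}\!\left[ \frac{Y^T \Sigma^{-1} Y}{2} \right] \\
     & = \frac{d}{2} \ln (2\pi) + \frac{1}{2} \ln |\Sigma| + \frac{d}{2}
     = \frac{1}{2} \ln \left( (2\pi e)^d |\Sigma| \right),
\end{aligned}
```

using $\mathbb{E}[Y^T \Sigma^{-1} Y] = \operatorname{tr}(\Sigma^{-1} \Sigma) = d$.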