Lecture 12 - 2025 / 5 / 13

Shannon Channel Coding Theorem

$C :=$ Channel Capacity $= \max_{P(X)} I(X; Y)$
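For a concrete instance (not worked out in the notes), the binary symmetric channel with crossover probability $p$ has capacity $C = 1 - H(p)$, achieved by the uniform input distribution. A minimal sketch:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover prob. p.
    The maximum over P(X) is attained at the uniform distribution."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.11))  # ≈ 0.5: about half a bit per channel use
```

A noiseless channel ($p = 0$) gives $C = 1$ bit per use; a useless one ($p = 0.5$) gives $C = 0$.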

Proof. (Assume $R < C$.)

  1. Assumption on the raw messages: message set $= \{m_1, m_2, \cdots, m_{2^{nR}}\}$, uniformly distributed.

  2. Design a codebook $CB = (C_1, \cdots, C_{2^{nR}})$:
    $$\begin{aligned} m_1 &\to C_1 = (c_{11}, c_{12}, \cdots, c_{1n})\\ m_2 &\to C_2 = (c_{21}, c_{22}, \cdots, c_{2n})\\ &\vdots\\ m_{2^{nR}} &\to C_{2^{nR}} = (c_{2^{nR}\,1}, c_{2^{nR}\,2}, \cdots, c_{2^{nR}\,n}) \end{aligned}$$

Idea: random coding

$c_{ij} \sim$ i.i.d. $P(X)$ for $i \in [2^{nR}],\ j \in [n]$
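Random codebook generation can be sketched directly (a minimal sketch; the function name, alphabet, and seed are illustrative, not from the notes):

```python
import random

def random_codebook(n, R, px, seed=0):
    """Draw 2^{nR} codewords of length n, each symbol i.i.d. ~ P(X).
    px maps each input symbol to its probability."""
    rng = random.Random(seed)
    symbols = list(px)
    weights = [px[s] for s in symbols]
    num_codewords = int(round(2 ** (n * R)))
    return [tuple(rng.choices(symbols, weights=weights, k=n))
            for _ in range(num_codewords)]

# 2^{nR} = 2^4 = 16 codewords of length n = 8 over a binary alphabet.
cb = random_codebook(n=8, R=0.5, px={"0": 0.5, "1": 0.5})
print(len(cb), len(cb[0]))  # 16 8
```

Note that $2^{nR}$ grows exponentially in $n$, which is why random coding is a proof device rather than a practical construction.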

with the capacity-achieving input distribution $P^*(X) = \arg\max_{P(X)} I(X; Y)$

  3. Decoding algorithm / rule
    Upon receiving $(y_1, \cdots, y_n)$, map it back to some codeword: $(y_1, \cdots, y_n) \xrightarrow{?} C_i$

Rule: decode to the $C_i = (x_1, \cdots, x_n)$ s.t. $(x_1, \cdots, x_n, y_1, \cdots, y_n)$ is a jointly typical sequence w.r.t. $P(X)P(Y|X)$
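The joint-typicality check behind this rule can be sketched numerically (a minimal sketch for a finite alphabet with a known joint distribution; `pxy` and the threshold conventions are illustrative assumptions, not from the notes):

```python
import math

def jointly_typical(xs, ys, pxy, eps):
    """Check the three typicality conditions for i.i.d. (X, Y) ~ pxy:
    the empirical rates -(1/n) log2 P(x^n), -(1/n) log2 P(y^n), and
    -(1/n) log2 P(x^n, y^n) must lie within eps of H(X), H(Y), H(X,Y)."""
    n = len(xs)
    # Marginals derived from the joint distribution pxy[(x, y)] -> prob.
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    entropy = lambda d: -sum(p * math.log2(p) for p in d.values() if p > 0)
    hx, hy, hxy = entropy(px), entropy(py), entropy(pxy)
    # Empirical log-likelihood rates of the observed sequences.
    lx = -sum(math.log2(px[x]) for x in xs) / n
    ly = -sum(math.log2(py[y]) for y in ys) / n
    lxy = -sum(math.log2(pxy[(x, y)]) for x, y in zip(xs, ys)) / n
    return abs(lx - hx) < eps and abs(ly - hy) < eps and abs(lxy - hxy) < eps
```

The decoder declares $C_i$ if it is the unique codeword jointly typical with $(y_1, \cdots, y_n)$, and an error otherwise.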

$P(\text{error}) \le 2^{nR} \cdot 2^{-nI(X;Y)} = 2^{n(R-C)} \to 0$ as $n \to \infty$, since $R < C$.
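This bound is a union bound over the $2^{nR} - 1$ incorrect codewords, each of which (being independent of the received sequence) is jointly typical with it with probability about $2^{-nI(X;Y)}$. A sketch of the chain, with $A_\epsilon^{(n)}$ the jointly typical set and the $3\epsilon$ slack taken from the standard joint-AEP statement:

```latex
P(\text{error}) \le \sum_{i' \ne i} P\big((C_{i'}, Y^n) \in A_\epsilon^{(n)}\big)
  \le (2^{nR} - 1)\, 2^{-n(I(X;Y) - 3\epsilon)}
  \le 2^{-n(C - R - 3\epsilon)} \xrightarrow{\,n \to \infty\,} 0
  \quad \text{for } R < C - 3\epsilon.
```

Letting $\epsilon \to 0$ recovers the claim for every rate $R < C$.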