Lecture 12 - 2025/5/13
Shannon Channel Coding Theorem
C := Channel Capacity = max_{P(X)} I(X;Y)
- If R < C, there exists an error-correcting code with error probability → 0.
- If R > C, every error-correcting code has error probability ≥ ε_0 > 0.
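As a concrete instance (my own example, not from the lecture): for a binary symmetric channel with crossover probability p, the capacity max_{P(X)} I(X;Y) can be found numerically by sweeping the input distribution; the known closed form is C = 1 − H(p).

```python
import numpy as np

def mutual_information(p_x, p):
    """I(X;Y) in bits for a binary symmetric channel with crossover
    probability p and input distribution P(X=1) = p_x."""
    # joint distribution P(x, y): rows index x, columns index y
    P = np.array([[(1 - p_x) * (1 - p), (1 - p_x) * p],
                  [p_x * p,             p_x * (1 - p)]])
    px = P.sum(axis=1, keepdims=True)   # marginal P(x)
    py = P.sum(axis=0, keepdims=True)   # marginal P(y)
    mask = P > 0
    return float((P[mask] * np.log2(P[mask] / (px * py)[mask])).sum())

# C = max over P(X) of I(X;Y); for the BSC the max is at p_x = 0.5,
# recovering the closed form C = 1 - H(p)
p = 0.1
grid = np.linspace(0.01, 0.99, 99)
C = max(mutual_information(px, p) for px in grid)
```

For p = 0.1 this gives C ≈ 0.531 bits per channel use, matching 1 − H(0.1).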
Proof. (Assume R<C)
- Assumption on the raw message: the message set is {m_1, m_2, ⋯, m_{2^{nR}}}, with the message drawn uniformly.
- Design codebook CB = (C_1, ⋯, C_{2^{nR}}), one codeword per message:
  m_1 → C_1 = (c_{1,1}, c_{1,2}, ⋯, c_{1,n})
  m_2 → C_2 = (c_{2,1}, c_{2,2}, ⋯, c_{2,n})
  ⋮
  m_{2^{nR}} → C_{2^{nR}} = (c_{2^{nR},1}, c_{2^{nR},2}, ⋯, c_{2^{nR},n})
Idea: random coding
c_{ij} ∼ i.i.d. P*(X) for i ∈ [2^{nR}], j ∈ [n], where
P*(X) = argmax_{P(X)} I(X;Y)
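The random-coding step can be sketched as follows (a toy sketch, assuming a binary alphabet with Bernoulli(p_x) as the capacity-achieving input distribution; the name `random_codebook` is my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_codebook(n, R, p_x):
    """Draw the 2^{nR} codewords C_1, ..., C_{2^{nR}}, each of length n,
    with entries c_ij i.i.d. Bernoulli(p_x) -- the random-coding idea."""
    M = round(2 ** (n * R))                     # number of messages, 2^{nR}
    return (rng.random((M, n)) < p_x).astype(int)

# e.g. n = 8, R = 0.5 gives 2^{nR} = 2^4 = 16 codewords of length 8
CB = random_codebook(n=8, R=0.5, p_x=0.5)
```

Each row of `CB` is one codeword C_i; row i encodes message m_i.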
- Decoding algorithm / rule
Upon receiving (y_1, ⋯, y_n), decode to some c_i.
Rule: pick c_i = (x_1, ⋯, x_n) s.t. (x_1, ⋯, x_n, y_1, ⋯, y_n) is a jointly typical sequence w.r.t. P(X)P(Y|X).
- (a) If no such (x_1, ⋯, x_n) exists, report failure (implicitly: if (y_1, ⋯, y_n) is not a typical sequence, report failure).
- (b) If more than one codeword is jointly typical with (y_1, ⋯, y_n), report failure.
- (c) If there is a unique c_i = (x_1, ⋯, x_n) jointly typical with (y_1, ⋯, y_n), output c_i.
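Rules (a)–(c) can be sketched as a decoder for the special case of a binary symmetric channel with uniform inputs, where (as a simplifying assumption I'm making, not the general definition) joint typicality reduces to the empirical flip fraction between a codeword and the received word being within ε of the crossover probability p:

```python
import numpy as np

def decode(CB, y, p, eps=0.08):
    """Joint-typicality decoding sketch for a BSC with uniform inputs.
    A codeword c counts as 'jointly typical' with y iff the fraction of
    positions where c != y is within eps of the crossover prob p.
    Returns the unique typical codeword's index (rule c), or None when
    there is no typical codeword (rule a) or more than one (rule b)."""
    flips = (CB != y).mean(axis=1)              # flip fraction per codeword
    hits = np.flatnonzero(np.abs(flips - p) < eps)
    return int(hits[0]) if len(hits) == 1 else None

rng = np.random.default_rng(1)
CB = (rng.random((16, 200)) < 0.5).astype(int)  # random codebook, n = 200
y = CB[0].copy()
y[:20] ^= 1                                     # exactly 10% of symbols flipped
m_hat = decode(CB, y, p=0.1)                    # recovers message index 0
```

With n = 200 the flip fraction concentrates: the true codeword lands near p = 0.1 while the other 15 random codewords land near 0.5, so the decoder returns a unique hit.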
P(b) ≤ 2^{nR} · 2^{−n·I(X;Y)} = 2^{n(R−C)} → 0 as n → ∞, since R < C (union bound over the 2^{nR} wrong codewords, each jointly typical with (y_1, ⋯, y_n) with probability ≈ 2^{−n·I(X;Y)}).
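To see how fast the bound decays, plug in toy numbers (assumed, not from the lecture): for a BSC with p = 0.1 we have C = 1 − H(0.1) ≈ 0.531, and at rate R = 0.4 < C the bound 2^{n(R−C)} vanishes exponentially in the block length n.

```python
# Union-bound decay 2^{n(R - C)}: exponentially small once R < C.
# Toy numbers (assumed): C = 1 - H(0.1) ~= 0.531 for a BSC, rate R = 0.4.
C, R = 0.531, 0.4
bounds = {n: 2 ** (n * (R - C)) for n in (50, 100, 200)}
```

Already at n = 100 the bound is below 10^{-3}, and doubling n roughly squares it.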