Lecture 6 - 2025 / 3 / 25
Joint differential entropy
h(X,Y) = −∬ f_{X,Y}(x,y) log f_{X,Y}(x,y) dx dy
Conditional differential entropy
h(X|Y) = −∬ f_{X,Y}(x,y) log f_{X|Y}(x|y) dx dy
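As a sanity check on these definitions, here is a small sketch (names and the choice of distribution are my own) that estimates h(X,Y) for a standard bivariate Gaussian with correlation ρ by Monte Carlo, using h(X,Y) = −E[log f(X,Y)], and compares it with the known closed form ½ log((2πe)²(1−ρ²)); h(X|Y) then follows from the chain rule h(X|Y) = h(X,Y) − h(Y).

```python
import math
import random

# Illustrative choice: standard bivariate Gaussian with correlation rho.
# Closed forms (in nats): h(X,Y) = (1/2) log((2*pi*e)^2 (1 - rho^2)),
#                         h(Y)   = (1/2) log(2*pi*e),
#                         h(X|Y) = h(X,Y) - h(Y)   (chain rule).
rho = 0.6

def joint_pdf(x, y):
    """Density of the standard bivariate normal with correlation rho."""
    z = (x * x - 2 * rho * x * y + y * y) / (1 - rho * rho)
    return math.exp(-z / 2) / (2 * math.pi * math.sqrt(1 - rho * rho))

def mc_joint_entropy(n=200_000, seed=0):
    """Monte Carlo estimate of h(X,Y) = -E[log f(X,Y)]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x, y = z1, rho * z1 + math.sqrt(1 - rho * rho) * z2
        total -= math.log(joint_pdf(x, y))
    return total / n

h_joint_exact = 0.5 * math.log((2 * math.pi * math.e) ** 2 * (1 - rho ** 2))
h_y = 0.5 * math.log(2 * math.pi * math.e)
print(mc_joint_entropy(), h_joint_exact)   # the two values agree closely
print("h(X|Y) =", h_joint_exact - h_y)     # conditional entropy via chain rule
```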
Definition (mutual information): Let X, Y be continuous r.v.s with joint density f_{X,Y} and conditional densities f_{X|Y}, f_{Y|X}. Then
I(X;Y) = h(X) − h(X|Y) = ∬ f_{X,Y}(x,y) log [ f_{X,Y}(x,y) / (f_X(x) f_Y(y)) ] dx dy
I(X;Y) = lim_{Δ→0} I(X^Δ; Y^Δ), where X^Δ, Y^Δ denote X, Y quantized into bins of width Δ.
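The quantization limit above can be seen numerically. The sketch below (an illustration of my own, not from the lecture) draws samples of a correlated Gaussian pair, quantizes them into bins of width Δ, computes the discrete mutual information of (X^Δ, Y^Δ) from the empirical pmf, and compares it with the Gaussian closed form I(X;Y) = −½ log(1−ρ²).

```python
import math
import random
from collections import Counter

# Illustrative parameters: Gaussian pair with correlation rho, bin width delta.
rho, delta, n = 0.8, 0.25, 200_000

def quantized_mi(seed=0):
    """Plug-in estimate of I(X^delta; Y^delta) in nats."""
    rng = random.Random(seed)
    joint = Counter()
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x, y = z1, rho * z1 + math.sqrt(1 - rho * rho) * z2
        joint[(math.floor(x / delta), math.floor(y / delta))] += 1
    px, py = Counter(), Counter()
    for (i, j), c in joint.items():
        px[i] += c
        py[j] += c
    mi = 0.0
    for (i, j), c in joint.items():
        # c/n is the empirical joint pmf; px[i]/n, py[j]/n the marginals
        mi += (c / n) * math.log(c * n / (px[i] * py[j]))
    return mi

exact = -0.5 * math.log(1 - rho * rho)  # continuous I(X;Y) for the Gaussian pair
print(quantized_mi(), exact)
```

Shrinking delta (with enough samples) moves the estimate toward the continuous value, which is the content of the limit above.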
Definition (KL divergence): Let f, g be two pdfs of continuous r.v.s. Then
D(f‖g) = ∫ f(x) log ( f(x) / g(x) ) dx
D(f‖g) = lim_{Δ→0} D(P^Δ ‖ Q^Δ), where P^Δ, Q^Δ are the pmfs obtained by quantizing f, g into bins of width Δ.
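A quick check of the definition (my own example, not from the lecture): for two univariate Gaussians, D(f‖g) has the closed form log(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − ½ in nats, and a direct Riemann-sum approximation of the integral recovers it.

```python
import math

# Illustrative densities: f = N(m1, s1^2), g = N(m2, s2^2).
m1, s1 = 0.0, 1.0
m2, s2 = 1.0, 2.0

def npdf(x, m, s):
    """Univariate normal density."""
    return math.exp(-((x - m) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def kl_numeric(lo=-12.0, hi=14.0, steps=200_000):
    """Midpoint-rule approximation of D(f||g) = int f log(f/g) dx."""
    dx = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        x = lo + (k + 0.5) * dx
        fx = npdf(x, m1, s1)
        if fx > 0.0:
            total += fx * math.log(fx / npdf(x, m2, s2)) * dx
    return total

kl_exact = math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5
print(kl_numeric(), kl_exact)
```

Note the asymmetry: swapping f and g changes the value, so D is a divergence, not a distance.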
Kolmogorov Complexity
Computation, Algorithm
(1) Problems: decision problems
(2) Algorithms: Turing machines
Given L ⊆ {0,1}*, decide whether x ∈ L for every input x ∈ {0,1}*.
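To make the decision-problem setup concrete, here is a toy decider for one particular language (my own illustrative choice, not from the lecture): L = binary strings representing multiples of 3. A three-state automaton tracking the value mod 3 decides membership; it behaves like a restricted Turing machine that scans its input once, left to right.

```python
# L = { x in {0,1}* : x, read as a binary number, is a multiple of 3 }.
# The empty string is treated as representing 0, hence accepted.

def decide(x: str) -> bool:
    """Return True iff x is in L, for any x in {0,1}*."""
    state = 0                       # value of the prefix read so far, mod 3
    for bit in x:
        state = (2 * state + int(bit)) % 3   # appending a bit doubles and adds
    return state == 0

print(decide("110"))   # 6 is a multiple of 3 -> True
print(decide("101"))   # 5 is not             -> False
```

Because the decider halts on every input and answers yes/no, L is decidable; Kolmogorov complexity will concern languages and strings where no such short, always-halting procedure exists.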