Lecture 6 - 2025/3/25

Joint differential entropy

$$h(X, Y) = -\iint f_{X,Y}(x, y) \log f_{X,Y}(x, y) \,\mathrm{d}x\,\mathrm{d}y$$
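As a quick numerical sanity check (not from the lecture): for two independent standard Gaussians, the joint differential entropy has the closed form $\tfrac{1}{2}\log\big((2\pi e)^2 \det\Sigma\big) = \log(2\pi e)$ nats, which a Riemann-sum approximation of the integral should reproduce. The grid bounds and step size below are arbitrary choices:

```python
import numpy as np

# Joint pdf of two independent N(0,1) variables on a grid;
# truncating to [-8, 8] discards negligible tail mass.
step = 0.01
xs = np.arange(-8, 8, step)
X, Y = np.meshgrid(xs, xs)
f = np.exp(-(X**2 + Y**2) / 2) / (2 * np.pi)

# Riemann sum of -f log f over the grid (entropy in nats).
h_numeric = -np.sum(f * np.log(f)) * step**2
h_closed = np.log(2 * np.pi * np.e)  # ≈ 2.8379 nats
print(h_numeric, h_closed)
```

The two values should agree to several decimal places, since the integrand is smooth and decays rapidly.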

Conditional differential entropy

$$h(X \mid Y) = -\iint f_{X,Y}(x, y) \log f_{X|Y}(x \mid y) \,\mathrm{d}x\,\mathrm{d}y$$

Definition (mutual information): Let $X, Y$ be continuous r.v.s with densities $f_{X,Y}, f_{X|Y}, f_{Y|X}$. Then

$$I(X; Y) = h(X) - h(X \mid Y) = \iint f_{X,Y}(x, y) \log \frac{f_{X,Y}(x, y)}{f_X(x) f_Y(y)} \,\mathrm{d}x\,\mathrm{d}y$$

$$I(X; Y) = \lim_{\Delta \to 0} I(X_\Delta; Y_\Delta)$$

where $X_\Delta, Y_\Delta$ are the quantizations of $X, Y$ into bins of width $\Delta$.
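This limit can be illustrated numerically (my own example, not from the lecture): for a bivariate Gaussian with correlation $\rho$, the mutual information has the closed form $-\tfrac{1}{2}\log(1-\rho^2)$, and the discrete mutual information of the $\Delta$-quantized pair, with cell probabilities approximated by midpoint density times cell area, matches it for small $\Delta$:

```python
import numpy as np

rho = 0.5
I_closed = -0.5 * np.log(1 - rho**2)  # MI of bivariate Gaussian, correlation rho

def quantized_mi(delta):
    # pmf of (X_Delta, Y_Delta): bin the plane into delta-by-delta cells and
    # approximate each cell's probability by midpoint density * area.
    xs = np.arange(-8, 8, delta) + delta / 2
    X, Y = np.meshgrid(xs, xs)
    f = np.exp(-(X**2 - 2 * rho * X * Y + Y**2) / (2 * (1 - rho**2)))
    f /= 2 * np.pi * np.sqrt(1 - rho**2)
    p = f * delta**2
    p /= p.sum()                              # renormalize to a proper pmf
    px = p.sum(axis=1, keepdims=True)         # marginal pmf of X_Delta
    py = p.sum(axis=0, keepdims=True)         # marginal pmf of Y_Delta
    return np.sum(p * np.log(p / (px * py)))  # discrete mutual information

for d in (0.5, 0.1, 0.02):
    print(d, quantized_mi(d), I_closed)
```

Each printed value is a discrete mutual information $I(X_\Delta; Y_\Delta)$ and approaches the continuous $I(X; Y)$ as $\Delta$ shrinks.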

Definition (KL divergence): Let $f, g$ be the pdfs of continuous r.v.s $X, Y$. Then
$$D(f \| g) = \int f(x) \log \frac{f(x)}{g(x)} \,\mathrm{d}x$$

$$D(f \| g) = \lim_{\Delta \to 0} D(P_\Delta \| Q_\Delta)$$

where $P_\Delta, Q_\Delta$ are the discrete distributions obtained by quantizing $f, g$ into bins of width $\Delta$.
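As another sanity check (my own example): for Gaussians, the KL divergence has the closed form $D\big(\mathcal{N}(\mu_1,\sigma_1^2) \,\|\, \mathcal{N}(\mu_2,\sigma_2^2)\big) = \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1-\mu_2)^2}{2\sigma_2^2} - \frac{1}{2}$, which a direct Riemann sum of $\int f \log(f/g)$ should match:

```python
import numpy as np

# f = N(0, 1), g = N(1, 4); evaluate D(f || g) = ∫ f log(f/g) dx numerically.
step = 0.001
xs = np.arange(-12, 12, step)
f = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)
g = np.exp(-(xs - 1)**2 / 8) / np.sqrt(8 * np.pi)

D_numeric = np.sum(f * np.log(f / g)) * step          # Riemann sum
D_closed = np.log(2) + (1 + 1) / 8 - 0.5              # Gaussian closed form
print(D_numeric, D_closed)
```

Note the integrand $f \log(f/g)$ can be negative pointwise; only the integral as a whole is guaranteed nonnegative.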

Kolmogorov Complexity

Computation, Algorithm

(1) Problems: Decision Problem

(2) Algorithm: Turing Machine

Given $L \subseteq \{0, 1\}^*$,

decide whether $x \in L$ for every $x \in \{0, 1\}^*$.
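A concrete decision problem can make this setup tangible. As a hypothetical example (the language below is my choice, not from the lecture), take $L$ to be the binary strings with an even number of 1s; a decider reads any input $x \in \{0,1\}^*$ and always halts with accept or reject, mirroring what a Turing machine for $L$ does:

```python
def decide(x: str) -> bool:
    """Return True iff x is in L, where L = {binary strings with an even
    number of 1s}. Assumes x consists only of the characters '0' and '1'."""
    state = 0              # parity of 1s seen so far (a 2-state machine)
    for bit in x:
        if bit == "1":
            state ^= 1     # flip parity on each 1
    return state == 0      # accept iff even parity

print(decide("1011"), decide("11"), decide(""))
```

This particular $L$ is decidable by a finite automaton; Turing machines matter because they also capture deciders that need unbounded working memory.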