Relative entropy for joint distribution of length n


In converse proofs in information theory that use Fano's inequality, one typically ends up with a bound of the form

$I(X^n;Y^n)\leq nI(X;Y)$

I was wondering whether an analogous single-letterization holds for relative entropy, i.e. something like

$D(P_{X^n}||Q_{X^n})\leq n D(P_X||Q_X)$



No, in general you cannot. Consider the following counterexample:

Let $P_{X^n}$ be the joint distribution of $n$ consecutive samples of a stationary Gaussian process, where the stationary (marginal) distribution is $P_X$. Let $Q_{X^n}=\prod_{i=1}^nP_X$ and $Q_X=P_X$. Then $D(P_X\Vert Q_X)=0$, so the right-hand side of the proposed inequality is zero, while $D(P_{X^n}\Vert Q_{X^n})$ need not vanish: it equals the total correlation among the $n$ samples, which is strictly positive whenever the process is correlated.
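A quick numerical check of the $n=2$ case (a sketch using NumPy; the function name `kl_gaussian` is mine, and I take the marginal $P_X$ to be standard normal with correlation $\rho$ between consecutive samples):

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL divergence D(N(mu0,S0) || N(mu1,S1)) in nats."""
    n = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff
                  - n + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

rho = 0.9
# P_{X^2}: two consecutive samples of the stationary process, correlation rho
S_joint = np.array([[1.0, rho], [rho, 1.0]])
# Q_{X^2} = P_X x P_X: product of the standard-normal marginals
S_prod = np.eye(2)

d_joint = kl_gaussian(np.zeros(2), S_joint, np.zeros(2), S_prod)
# D(P_X || Q_X) = 0 since P_X = Q_X = N(0,1), yet the joint divergence
# is -0.5*log(1 - rho^2) > 0, violating D(P_{X^2}||Q_{X^2}) <= 2*D(P_X||Q_X).
print(d_joint)
```

With $\rho=0.9$ this prints roughly $0.83$ nats, matching the closed form $-\tfrac12\log(1-\rho^2)$.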