Conditional Entropy on a quantized random variable


Let $X$ and $Y$ be discrete random variables. The conditional entropy of $Y$ given $X$ is:

$$H(Y|X) = \sum_{x}P(x)H(Y|X=x)$$
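As a concrete illustration of this definition, here is a small sketch (the function name `cond_entropy` and the example distribution are my own, not from the question) that computes $H(Y|X)$ directly from a joint pmf:

```python
import numpy as np

def cond_entropy(p_xy):
    """H(Y|X) = sum_x P(x) H(Y|X=x), with p_xy a 2-D array of
    joint probabilities: rows indexed by x, columns by y."""
    p_x = p_xy.sum(axis=1)                      # marginal P(x)
    h = 0.0
    for x in range(p_xy.shape[0]):
        if p_x[x] == 0:
            continue
        p_y_given_x = p_xy[x] / p_x[x]          # conditional P(y | X = x)
        nz = p_y_given_x > 0                    # skip zero-probability terms
        h += p_x[x] * -(p_y_given_x[nz] * np.log2(p_y_given_x[nz])).sum()
    return h

# Example: X and Y independent and uniform on {0, 1}
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(cond_entropy(p_xy))   # -> 1.0 bit, since Y is uniform regardless of X
```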

Assume we quantize $X$ with a quantizer $Q$. How can we prove that the resulting conditional entropy is no smaller than the original conditional entropy?

$$H(Y|X) \leq H(Y|Q(X))$$


Here I just want to elaborate on what @stochasticboy321 commented. We know that $Q(X)$ is conditionally independent of $Y$ given $X$. So we have a Markov chain like this:

$$Y \longrightarrow X \longrightarrow Q(X)$$

Now we can apply the Data Processing Inequality (DPI), which states that if random variables $X$, $Y$, and $Z$ form a Markov chain $X \to Y \to Z$, then the mutual information between $X$ and $Y$ is at least the mutual information between $X$ and $Z$. Applying this to our chain gives:
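For completeness, here is a sketch of why the DPI holds for our chain, using the chain rule for mutual information and the fact that $I(Y;Q(X)\mid X)=0$ by the Markov property:

$$I(Y;\, X, Q(X)) \;=\; I(Y;X) + \underbrace{I(Y;Q(X)\mid X)}_{=\,0} \;=\; I(Y;Q(X)) + \underbrace{I(Y;X\mid Q(X))}_{\geq\, 0},$$

and comparing the two expansions yields $I(Y;X) \geq I(Y;Q(X))$.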

$$I(Y;X) \geq I(Y;Q(X))$$

By expanding the mutual information in terms of entropy we have:

$$H(Y) - H(Y|X) \geq H(Y)-H(Y|Q(X))$$

Cancelling $H(Y)$ on both sides and rearranging yields the desired result: $$H(Y|Q(X)) \geq H(Y|X)$$
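The inequality can also be checked numerically. The sketch below (the quantizer $Q(x) = \lfloor x/2 \rfloor$ and the random joint distribution are illustrative choices of mine, not from the answer) uses the identity $H(Y|X) = H(X,Y) - H(X)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def cond_entropy(p_ab):
    """H(B|A) = H(A,B) - H(A), rows of p_ab indexed by a."""
    return entropy(p_ab.ravel()) - entropy(p_ab.sum(axis=1))

# Random joint pmf over X in {0,...,5} and Y in {0,...,3}
p_xy = rng.random((6, 4))
p_xy /= p_xy.sum()

# Quantizer Q(x) = x // 2 maps {0,...,5} onto {0, 1, 2};
# the joint pmf of (Q(X), Y) merges the corresponding rows.
p_qy = np.zeros((3, 4))
for x in range(6):
    p_qy[x // 2] += p_xy[x]

print(cond_entropy(p_xy) <= cond_entropy(p_qy) + 1e-12)  # -> True
```

Merging rows of the joint pmf is exactly what a deterministic quantizer does to the distribution, so the comparison above is a direct check of $H(Y|X) \leq H(Y|Q(X))$.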