Comparison of the entropy of the product of two random variables


$X$ and $Y$ are two discrete random variables; that is all the information we have about them.

How, then, can we compare $H(XY)$ with $H(X)$ and with $H(X)+H(Y)$?


If $X$ takes the symbols $\{x_1,x_2,\cdots \}$ with probabilities $\{p_1,p_2,\cdots \}$ and $Y$ takes the symbols $\{y_1,y_2,\cdots \}$ with probabilities $\{q_1,q_2,\cdots \}$, then $XY$ takes values in $\{x_iy_j\}$; when $X$ and $Y$ are independent, the value $x_iy_j$ occurs with probability $p_iq_j$. Since $XY$ is a function of the pair $(X,Y)$, we always have $$H(XY)\le H(X,Y)\le H(X)+H(Y)$$Both inequalities become equalities when $X$ and $Y$ are independent and all the products $\{x_iy_j\}$ are distinct, for then $(X,Y)\mapsto XY$ is a bijection and $$H(XY)=H(X,Y)=H(X)+H(Y)$$
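As a quick numerical sanity check of the inequality $H(XY)\le H(X)+H(Y)$, here is a small Python sketch for independent $X$ and $Y$ (the helper names `entropy` and `product_pmf` are my own):

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def product_pmf(px, py):
    """PMF of XY for independent X ~ px, Y ~ py; colliding products x_i*y_j merge mass."""
    pmf = Counter()
    for (x, p), (y, q) in product(px.items(), py.items()):
        pmf[x * y] += p * q
    return dict(pmf)

# All products distinct: X uniform on {1,2}, Y uniform on {3,4} gives {3,4,6,8}
px, py = {1: 0.5, 2: 0.5}, {3: 0.5, 4: 0.5}
print(entropy(product_pmf(px, py)), entropy(px) + entropy(py))  # 2.0 2.0 (equality)

# A collision: Y uniform on {2,4} gives products {2,4,4,8}, so uncertainty drops
py2 = {2: 0.5, 4: 0.5}
print(entropy(product_pmf(px, py2)))  # 1.5 < 2.0 (strict inequality)
```

When products collide, mass merges onto fewer symbols and the entropy of $XY$ falls strictly below $H(X)+H(Y)$, matching the bijection condition above.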

However, nothing can be said about the relation between $H(XY)$ and $H(X)$. If $X$ and $Y$ are independent, with $X$ uniform on $\{1,2\}$ and $Y$ uniform on $\{3,4\}$, then the four products are distinct and $$H(XY)=2>1=H(X)$$ On the other hand, if $\Pr\{X=-1\}=\Pr\{X=1\}={1\over 2}$ and $Y=X$ a.e., then $XY=X^2=1$ with probability one, so $$H(XY)=H(X^2)=0<1=H(X)$$
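Both counterexamples can be checked numerically; a minimal Python sketch (the `entropy` helper is my own):

```python
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Example 1: X uniform on {1,2}, Y uniform on {3,4}, independent.
# XY is uniform on the four distinct products {3,4,6,8}.
px = {1: 0.5, 2: 0.5}
pxy = {3: 0.25, 4: 0.25, 6: 0.25, 8: 0.25}
print(entropy(pxy), entropy(px))  # 2.0 1.0 -> H(XY) > H(X)

# Example 2: X uniform on {-1,1} and Y = X, so XY = X**2 = 1 always.
px2 = {-1: 0.5, 1: 0.5}
pxy2 = {1: 1.0}
print(entropy(pxy2), entropy(px2))  # 0.0 1.0 -> H(XY) < H(X)
```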