Inequality in differential entropy


In the book "Network Information Theory" by El Gamal, there is a question asking which relation ($\geq$, $\leq$, or $=$) holds in the following setting:

Let $X$ be a continuous random variable, and let $Y\sim N(0,1)$ (standard normal) be independent of $X$. Let $a \geq 1$. What is the relation between $h(X+aY)$ and $h(X+Y)$?

Here $h(\cdot)$ denotes differential entropy: if $X$ has pdf $f_X$, then $$h(X) = -\int_{-\infty}^\infty f_X(x)\log_2 f_X(x)\,dx.$$
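As a sanity check on this definition, the integral can be approximated on a grid and compared against the known closed form for a standard normal, $h(Y) = \frac{1}{2}\log_2(2\pi e)$. A minimal numerical sketch (function names are mine):

```python
import numpy as np

def diff_entropy_bits(pdf, grid):
    """Approximate h(X) = -∫ f(x) log2 f(x) dx by a Riemann sum on a uniform grid."""
    f = pdf(grid)
    dx = grid[1] - grid[0]
    mask = f > 0  # skip zero-density points to avoid log(0)
    return -np.sum(f[mask] * np.log2(f[mask])) * dx

# Standard normal pdf
pdf = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
grid = np.linspace(-10.0, 10.0, 200001)

h_num = diff_entropy_bits(pdf, grid)
h_closed = 0.5 * np.log2(2 * np.pi * np.e)  # ≈ 2.047 bits
print(h_num, h_closed)
```

The two values agree to high precision, since the Gaussian tails beyond $\pm 10$ contribute negligibly.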

My guess is that it should be $\geq$. For instance, if $X$ is normal with variance $P$, I have shown that $\geq$ holds. For general $X$, I thought of manipulating the entropy terms with some clever conditioning, but I didn't get anywhere. As a last resort, I even considered expanding out each term and comparing directly, but again to no avail.

I am sure there is a clever trick/transformation here that would give the answer, but I am unable to see it. Hence my request for help.

Edit: You are welcome to use the Entropy Power Inequality if it helps; in fact, any property of differential entropy may be used here. Also, since $Y$ is standard normal, $h(Y) = \frac{1}{2}\log_2(2\pi e)$.
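For reference, the Entropy Power Inequality states that for independent $X$ and $Y$ (entropies in nats), $e^{2h(X+Y)} \ge e^{2h(X)} + e^{2h(Y)}$, with equality when both are Gaussian. A quick sketch verifying the equality case for independent Gaussians (helper names are mine):

```python
import numpy as np

def h_gauss_nats(var):
    """Differential entropy of N(0, var) in nats: ½ ln(2πe·var)."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def epi_gap(hx, hy, hxy):
    """EPI slack: e^{2h(X+Y)} - (e^{2h(X)} + e^{2h(Y)}), nonnegative for independent X, Y."""
    return np.exp(2 * hxy) - (np.exp(2 * hx) + np.exp(2 * hy))

# X ~ N(0, P) and Y ~ N(0, 1) independent, so X + Y ~ N(0, P + 1);
# Gaussians achieve equality in the EPI, so the gap is (numerically) zero.
P = 4.0
gap = epi_gap(h_gauss_nats(P), h_gauss_nats(1.0), h_gauss_nats(P + 1.0))
print(gap)
```

Here $e^{2h} = 2\pi e \cdot \mathrm{var}$, so the gap reduces to $2\pi e\,(P+1) - 2\pi e\,P - 2\pi e = 0$ up to floating-point rounding.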

Accepted answer:

Hint: Let $Z_a=X+aY$ and $w(a)=h(Z_a)=h(X+aY)$, in nats. Then show that $w(a)$ is increasing, using de Bruijn's identity:

$$ \frac{\partial }{\partial t} h(X+\sqrt{t}\,Y)=\frac{1}{2}J(X+\sqrt{t}\,Y)\ge 0$$

where $J(\cdot)$ is the Fisher information (see e.g. Cover & Thomas, Theorem 17.7.2). Taking $t=a^2$ gives $w(a)=h(X+\sqrt{t}\,Y)$, which is nondecreasing in $t$ and hence in $a$; in particular, $h(X+aY)\ge h(X+Y)$ for $a\ge 1$.
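The monotonicity of $w(a)$ can also be checked numerically for a non-Gaussian $X$. For $X\sim\mathrm{Uniform}(0,1)$, the density of $Z=X+aY$ has the closed form $f(z)=\Phi(z/a)-\Phi((z-1)/a)$, where $\Phi$ is the standard normal CDF, so $h(X+aY)$ can be computed directly. A sketch (function names are mine):

```python
import math
import numpy as np

def Phi(x):
    """Standard normal CDF, evaluated pointwise via math.erf."""
    return np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2))) for v in x])

def h_bits_uniform_plus_aY(a, lo=-40.0, hi=41.0, n=162001):
    """h(X + aY) in bits for X ~ Uniform(0,1) and Y ~ N(0,1) independent.
    The density of Z = X + aY is f(z) = Phi(z/a) - Phi((z-1)/a)."""
    z = np.linspace(lo, hi, n)
    f = np.clip(Phi(z / a) - Phi((z - 1) / a), 0.0, None)
    dz = z[1] - z[0]
    mask = f > 0  # skip zero-density points to avoid log(0)
    return -np.sum(f[mask] * np.log2(f[mask])) * dz

hs = [h_bits_uniform_plus_aY(a) for a in (1.0, 2.0, 4.0)]
print(hs)  # increasing in a, as de Bruijn's identity predicts
```

The computed entropies increase with $a$, consistent with $w'(a) = a\,J(X+aY) \ge 0$.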