Entropy upper bound inequality for Sub-Gaussian Random Variable


We say that the random variable $Z$ is $\sigma^2$-sub-Gaussian if $\mathbb{E} \exp(tZ) \leq \exp(t^2\sigma^2/2)$ for all $t\in\mathbb{R}$.

Define the $(x\log x)$-entropy (or simply the entropy) of a nonnegative random variable $Z$ by $\text{Ent}(Z):= \mathbb{E}(Z\log Z)- \mathbb{E}Z \log (\mathbb{E}Z)$. Here $\log$ is the natural logarithm.

I would like to prove the following bound: if $X-\mathbb{E}(X)$ is $\sigma^2$-sub-Gaussian, then $\text{Ent}(\exp (t X))\leq t^2\alpha \;\mathbb{E}(\exp(tX))$ for all $t\geq 0$, where $\alpha$ is some constant depending only on $\sigma$.

How do I get this bound? I have tried Jensen's inequality and various other manipulations but could not obtain it.
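As a sanity check (not a proof): for $X \sim N(0,\sigma^2)$ one has $M(t)=e^{t^2\sigma^2/2}$, and a direct computation gives $\text{Ent}(e^{tX}) = \frac{t^2\sigma^2}{2}\,\mathbb{E}e^{tX}$ exactly, so $\alpha = \sigma^2/2$ is the natural candidate constant. A quick Monte Carlo check in Python (the seed, $t$, and sample size below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, t = 1.0, 0.7
X = rng.normal(0.0, sigma, size=2_000_000)  # X - E[X] is sigma^2-sub-Gaussian

Z = np.exp(t * X)
# Ent(Z) = E[Z log Z] - E[Z] log E[Z]
ent = np.mean(Z * np.log(Z)) - np.mean(Z) * np.log(np.mean(Z))
# candidate bound t^2 * alpha * E[e^{tX}] with alpha = sigma^2 / 2
bound = t**2 * (sigma**2 / 2) * np.mean(Z)

print(ent, bound)  # for Gaussian X the two agree up to sampling error
```

For Gaussian $X$ the two numbers coincide (up to Monte Carlo error), which suggests that no constant smaller than $\sigma^2/2$ can work in general.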


There are 2 best solutions below


Let $M(t)= E[e^{tX}]$. Then
$$\operatorname{Ent}(e^{tX}) = E[tXe^{tX}] - E[e^{tX}]\log E[e^{tX}] = tM'(t) - M(t)\log M(t) = M(t)\,t^2\,\frac{d}{dt}\frac{\log M(t)}{t}.$$

Given: $$E[e^{t(X-E[X])}] \leq e^{t^2 \sigma^2/2} \implies M(t) \leq e^{\frac{t^2 \sigma^2}{2}+tE[X]} \implies \log M(t) \leq \frac{t^2 \sigma^2}{2} +tE[X].$$

Thus, for $t > 0$, $$\frac{\log M(t)}{t} \leq \frac{t\sigma^2}{2} + E[X].$$

Consider $$\lim_{t \to 0} \frac{\log M(t)}{t} = \lim_{t \to 0} \frac{M'(t)}{M(t)}= E[X].$$

Now suppose the desired bound holds: $\operatorname{Ent}(e^{tX})\leq t^2\alpha\, E[e^{tX}]$ for all $t\geq 0$. By the identity above, this means $M(t)\,t^2\frac{d}{dt}\frac{\log M(t)}{t} \leq t^2\alpha\, M(t)$, i.e. $$\frac{d}{dt}\frac{\log M(t)}{t} \leq \alpha.$$

Integrating both sides from $0$ to $t$ and using $\lim_{s \to 0} \frac{\log M(s)}{s}= E[X]$, we get

$$\frac{\log M(t)}{t} \leq E[X] + \alpha t.$$

Comparing this with the earlier inequality, we see that the constant must satisfy $\alpha \geq \frac{\sigma^2}{2}$.
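The identity used at the start, $\operatorname{Ent}(e^{tX}) = M(t)\,t^2\frac{d}{dt}\frac{\log M(t)}{t}$, can also be checked numerically. The sketch below (the distribution, $t$, step size, and sample size are arbitrary choices) compares the empirical entropy against a central finite difference of $\log M(t)/t$ computed from the same sample:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=1_000_000)  # any light-tailed distribution works
t, h = 0.5, 1e-4

def M(s):
    """Empirical moment generating function E[e^{sX}]."""
    return np.mean(np.exp(s * X))

Z = np.exp(t * X)
# Ent(e^{tX}) = E[tX e^{tX}] - M(t) log M(t)
ent = np.mean(Z * np.log(Z)) - M(t) * np.log(M(t))

g = lambda s: np.log(M(s)) / s                       # log M(s) / s
rhs = M(t) * t**2 * (g(t + h) - g(t - h)) / (2 * h)  # M(t) t^2 d/dt [log M(t)/t]

print(ent, rhs)  # agree up to finite-difference error
```

Because both quantities are computed from the same empirical $M$, they match to within the $O(h^2)$ finite-difference error, confirming the algebraic identity.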


So if I understand right, if $X-\mathbb EX$ is $\sigma^2$-sub-Gaussian, then $$\forall t\in\mathbb R:\ \mathbb Ee^{t(X-\mathbb E X)}\le e^{t^2\sigma^2/2}.\tag1$$

Taking the logarithm of both sides of (1), we get a bound on the so-called cumulant-generating function of $X$: $$\forall t\in\mathbb R:\ \log\mathbb Ee^{tX}-t\,\mathbb EX\le \frac{t^2\sigma^2}2.\tag2$$

Multiplying both sides by $\mathbb E e^{tX}$ (which is always positive and depends only on $t$), we have: $$\forall t\in\mathbb R:\ \mathbb Ee^{tX}\log\mathbb Ee^{tX}-\mathbb Ee^{tX}\,\mathbb E[tX]\le \frac{t^2\sigma^2}2\mathbb E e^{tX}.\tag3$$

Now, for $t\ge 0$ the functions $x\mapsto e^{tx}$ and $x\mapsto tx$ are both nondecreasing, so by Chebyshev's association inequality (not Jensen's) they are positively correlated: $\mathbb E(e^{tX}tX)\ge\mathbb Ee^{tX}\,\mathbb E[tX]$. Substituting this into (3) gives $$\forall t\ge 0:\ \mathbb Ee^{tX}\log\mathbb Ee^{tX}-\mathbb E(e^{tX}tX)\le \frac{t^2\sigma^2}2\mathbb E e^{tX},\tag4$$ but in terms of your entropy this bounds the *negative* of it: $$\forall t\ge 0:\ -\operatorname{Ent}(e^{tX})\le \frac{t^2\sigma^2}2\mathbb E e^{tX}.\tag5$$ Since $\operatorname{Ent}$ is always nonnegative, (5) is trivially true, so this argument does not give the requested upper bound.
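The correlation inequality between $e^{tX}$ and $tX$ that drives the step from (3) to (4) can be checked numerically: for $t\ge 0$ both functions are nondecreasing in $X$, so $\mathbb E(e^{tX}\,tX)\ge\mathbb Ee^{tX}\,\mathbb E[tX]$ (Chebyshev's association inequality). A quick check in Python (the distribution, $t$, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
t = 0.5
X = rng.normal(0.0, 1.0, size=1_000_000)

lhs = np.mean(np.exp(t * X) * t * X)           # E[e^{tX} tX]
rhs = np.mean(np.exp(t * X)) * np.mean(t * X)  # E[e^{tX}] E[tX]
# both factors are nondecreasing in X, hence positively correlated
print(lhs >= rhs)
```

Here `lhs` is strictly larger than `rhs` (for standard normal $X$, $\mathbb E[tX]\approx 0$ while $\mathbb E[e^{tX}tX]=t^2e^{t^2/2}>0$), illustrating the direction of the inequality.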