Is it the case that $E[\exp(X)] = \exp(E[X])$, where $X$ is a random variable?
I know this is too simple, but I must be googling the wrong things.
Let $X$ be zero or one each with probability $1/2$. Then $E(X)=1/2$ and $E(\exp(X))=\frac12(1+e)$. Does that equal $\exp(E(X))$?
The exponential function is convex, so by Jensen's inequality $$\Bbb{E}(e^X)\ge e^{\Bbb{E}(X)}.$$
Since $e^x$ is strictly convex, equality holds only if $X$ is degenerate (almost surely constant).
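A quick numerical sanity check of the inequality, using the two-point example above (plain Python, standard library only):

```python
import math

# X takes the values 0 and 1, each with probability 1/2
values = [0.0, 1.0]
probs = [0.5, 0.5]

e_of_exp = sum(p * math.exp(x) for p, x in zip(probs, values))  # E[exp(X)] = (1 + e)/2
exp_of_e = math.exp(sum(p * x for p, x in zip(probs, values)))  # exp(E[X]) = e^{1/2}

print(e_of_exp)  # ≈ 1.859
print(exp_of_e)  # ≈ 1.649
assert e_of_exp >= exp_of_e  # Jensen: E[exp(X)] >= exp(E[X])
```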
In general, note that $$ \mathbb{E}\left[e^X\right] = \int_{-\infty}^\infty e^x f(x) dx \ne \exp \left( \int_{-\infty}^\infty xf(x) dx\right) = e^{\mathbb{E}[X]}. $$
It's easy to pick a counterexample, e.g. $X \sim \mathcal{U}(0,1)$. Then the left-hand side becomes $$ \mathbb{E}\left[e^X\right] = \int_0^1 e^x dx = e-1 $$ and the right-hand side is $$ e^{\mathbb{E}[X]} = \exp \left( \int_0^1 x dx\right) = \sqrt{e}. $$
If $X$ is a constant random variable, then both sides always agree. More generally, since $e^x$ is strictly convex, equality requires $X$ to be almost surely constant, which is exactly the condition under which the two integrals above coincide.
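The uniform counterexample can be checked numerically; the snippet below (an illustrative sketch, standard library only) compares the closed forms $e-1$ and $\sqrt{e}$ and verifies the integral with a midpoint Riemann sum:

```python
import math

# Uniform(0,1): E[exp(X)] = ∫_0^1 e^x dx = e - 1, while exp(E[X]) = exp(1/2)
lhs = math.e - 1      # ≈ 1.718
rhs = math.exp(0.5)   # ≈ 1.649

# Midpoint Riemann-sum check of ∫_0^1 e^x dx
n = 100_000
approx = sum(math.exp((i + 0.5) / n) for i in range(n)) / n

assert abs(approx - lhs) < 1e-6  # the integral really is e - 1
assert lhs > rhs                 # E[exp(X)] > exp(E[X]) for this X
```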
That won't be true for a general random variable. E.g. let $X$ be a Rademacher random variable, $\mathbb P[X=\pm1]=\frac{1}{2}$. Then $E[X]=0$, so $$\exp(E[X])=\exp(0)=1$$ while $$E[\exp(X)]=\frac{1}{2}\left(\exp(1)+\exp(-1)\right)=\cosh(1)\approx 1.543.$$
If $T$ is a linear map then $E[TX]=TE[X]$, but of course $\exp$ isn't linear!
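For completeness, a short check of the ±1 example (a sketch in plain Python):

```python
import math

# X = ±1 with probability 1/2 each, so E[X] = 0
e_of_exp = 0.5 * (math.exp(1) + math.exp(-1))  # E[exp(X)] = cosh(1)
exp_of_e = math.exp(0.0)                       # exp(E[X]) = 1

assert math.isclose(e_of_exp, math.cosh(1))
assert e_of_exp > exp_of_e  # strict inequality: X is not degenerate
```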