$X$ and $Y$ are independent normally distributed random variables, each with expected value $0$ and variance $1/8$. Calculate the expected value and variance of $Z=e^{(X+Y)^2}$.
My approach: $EX^2 = EY^2 = 1/8$, since each variance is $1/8$ and $EX = EY = 0$; also $E[XY] = EX \cdot EY = 0$ by independence, so $E(X+Y)^2 = 1/4$. And here is my question: is it true that $Ee^{(X+Y)^2} = e^{E(X+Y)^2}$?
If so, why? If not, how do I proceed? And can this be broadened to a general $f(X)$?
In general, $E[g(X)]\ne g (E[X])$ (the equality holds if $g()$ is linear - or if the random variable is actually a constant).
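To see the inequality numerically (my own illustration, not part of the original answer), take the nonlinear $g(x) = x^2$ and $X \sim N(0,1)$: then $E[g(X)] = \operatorname{Var}(X) = 1$, while $g(E[X]) = 0$.

```python
import numpy as np

# For g(x) = x^2 and X ~ N(0, 1): E[g(X)] = 1, but g(E[X]) = 0.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=1_000_000)

e_of_g = np.mean(x**2)   # Monte Carlo estimate of E[g(X)], close to 1
g_of_e = np.mean(x)**2   # g applied to the estimated mean, close to 0

print(e_of_g, g_of_e)
```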
I can think of two ways of attacking this.
First, express (if you can) $g(x)$ as a Taylor series (perhaps around the mean); then $E[g(X)]$ becomes an infinite sum involving the derivatives of $g$ and the (centered) moments of $X$ (example).
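As a sketch of how that series route plays out for this particular $g$ (my own worked illustration, under the assumption $W = X+Y \sim N(0,\sigma^2)$ with $\sigma^2 = 1/4$): expanding $e^{w^2} = \sum_k w^{2k}/k!$ and using the even normal moments $E[W^{2k}] = \sigma^{2k}(2k-1)!!$ gives $E[e^{W^2}] = \sum_k \sigma^{2k}(2k-1)!!/k!$, which converges geometrically here.

```python
import math

# Series for E[e^{W^2}] with W ~ N(0, sigma^2), sigma^2 = 1/4:
# sum over k of sigma^(2k) * (2k-1)!! / k!
sigma2 = 0.25
total = 0.0
for k in range(60):
    # (2k-1)!! = (2k)! / (2^k * k!)
    double_fact = math.factorial(2 * k) // (2**k * math.factorial(k))
    total += sigma2**k * double_fact / math.factorial(k)

print(total)  # partial sum of the series for E[e^{W^2}]
```

The partial sums should agree with whatever closed form the direct integral below produces.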
Second, compute it directly.
I'd go for the second one here, because $g()$ looks related to the density.
First, a simplification: notice that $W=X+Y$ is normal, $W \sim N(0,1/4)$, with density $f_W(w)=\sqrt{\frac{2}{\pi}}\, \exp(-2 w^2)$.
Then
$$E[Z]=E[e^{W^2}]= \sqrt{\frac{2}{\pi}}\int e^{w^2} e^{-2 w^2} dw$$
Can you go on from here?