I'm stuck with a homework problem where we are supposed to prove that if $X$ has a standard normal distribution, then its even moments are given by $$E[X^{2k}]=\frac{(2k)!}{k!\cdot2^k}.$$ But I cannot think of the correct approach. Can anyone help me?
all the best :)
Marie
So, here is the solution:
Recall the moment generating function $$M_X(t)=E[e^{tX}].$$ Differentiating $k$ times with respect to $t$ and evaluating at $t=0$ gives $$M^{(k)}_X(0)=E[X^k\, e^{0\cdot X}]=E[X^k].$$ For the standard normal distribution we know that $M_X(t)=e^{\frac{t^2}{2}}$, so it remains to investigate the derivatives of $e^{\frac{t^2}{2}}$ at $t=0$. We use the exponential series $$e^x=\sum_{j=0}^{\infty}\frac{x^j}{j!},$$ hence $$e^{\frac{t^2}{2}}=\sum_{j=0}^{\infty}\frac{ \left( \frac{t^2}{2} \right)^j}{j!}=\sum_{j=0}^{\infty} \frac{t^{2j}}{j!\cdot 2^j}.$$

Now take the $(2k)^{th}$ derivative of this series and evaluate at $t=0$. Every summand of the derivative that still contains a factor $t$ vanishes at $t=0$, so only the constant term survives. That constant term comes from the $j=k$ summand: differentiating $t^{2k}$ exactly $2k$ times gives $(2k)!$, so the $(2k)^{th}$ derivative at $0$ equals $(2k)!$ times the coefficient of $t^{2k}$ in the series above. Therefore $$E[X^{2k}]=M^{(2k)}_X(0)=\frac{(2k)!}{k!\cdot 2^k},$$ as desired.
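As a quick sanity check (not part of the proof), here is a short Python sketch that compares the closed form $\frac{(2k)!}{k!\cdot 2^k}$ against a direct numerical integration of $\int x^{2k}\varphi(x)\,dx$, where $\varphi$ is the standard normal density. The helper names and the choice of Simpson's rule are mine, not from the answer above.

```python
import math

def normal_even_moment(k, n=200_001, lim=12.0):
    """Approximate E[X^(2k)] for X ~ N(0,1) by integrating
    x^(2k) * exp(-x^2/2) / sqrt(2*pi) over [-lim, lim]
    with the composite Simpson rule (n odd, so n-1 intervals is even)."""
    h = 2 * lim / (n - 1)
    total = 0.0
    for i in range(n):
        x = -lim + i * h
        w = 1 if i in (0, n - 1) else (4 if i % 2 == 1 else 2)
        total += w * x ** (2 * k) * math.exp(-x * x / 2)
    return total * h / 3 / math.sqrt(2 * math.pi)

def closed_form(k):
    """The claimed formula (2k)! / (k! * 2^k)."""
    return math.factorial(2 * k) / (math.factorial(k) * 2 ** k)

for k in range(5):
    print(k, closed_form(k), normal_even_moment(k))
```

For small $k$ this reproduces the familiar values $E[X^2]=1$, $E[X^4]=3$, $E[X^6]=15$.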
All the best! Marie :)