"Inverse" moment generating function of standard normal distributed random variable


This may be a trivial question, but: is the moment generating function of $X$ the same as that of $-X$ when $X$ is normally distributed, i.e. is $E(e^{tX})=E(e^{-tX})$? If not, what is the difference between them?


3 Answers

Accepted Answer

If $$X \sim \operatorname{Normal}(\mu = 0, \sigma^2),$$ then yes, $\operatorname{E}[e^{tX}] = \operatorname{E}[e^{-tX}]$. Otherwise, it is not true.

We can perform the computation explicitly: $$\begin{align*} \operatorname{E}[e^{tX}] &= \int_{x=-\infty}^\infty e^{tx} \frac{e^{-x^2/(2\sigma^2)}}{\sqrt{2\pi} \sigma} \, dx \\ &= \int_{x=-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} e^{-(x^2 - 2\sigma^2 t x + (\sigma^2 t)^2)/(2\sigma^2)} e^{\sigma^2 t^2/2} \, dx \\ &= e^{\sigma^2 t^2/2} \int_{x=-\infty}^\infty \frac{e^{-(x-\sigma^2 t)^2/(2\sigma^2)}}{\sqrt{2\pi}\sigma} \, dx \\ &= e^{\sigma^2 t^2/2}, \end{align*}$$ since the last integrand is the density of a normal distribution with mean $\sigma^2 t$ and variance $\sigma^2$, thus integrates to $1$. It follows that $$\operatorname{E}[e^{-tX}] = e^{\sigma^2 (-t)^2/2} = e^{\sigma^2 t^2/2} = \operatorname{E}[e^{tX}].$$
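As a numerical sanity check of the computation above (a sketch, not part of the original answer; the sample size, seed, and parameter values are arbitrary choices), Monte Carlo estimates of both $E[e^{tX}]$ and $E[e^{-tX}]$ for $X \sim \operatorname{Normal}(0, \sigma^2)$ should land near the closed form $e^{\sigma^2 t^2/2}$:

```python
import math
import random

def mgf_mc(t, mu=0.0, sigma=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of E[exp(t*X)] for X ~ Normal(mu, sigma^2)."""
    rng = random.Random(seed)
    return sum(math.exp(t * rng.gauss(mu, sigma)) for _ in range(n)) / n

sigma, t = 1.0, 0.5
closed_form = math.exp(sigma**2 * t**2 / 2)  # e^{sigma^2 t^2 / 2}
est_pos = mgf_mc(t, sigma=sigma)   # estimate of E[e^{tX}]
est_neg = mgf_mc(-t, sigma=sigma)  # estimate of E[e^{-tX}]
```

Both estimates agree with the closed form up to Monte Carlo error, consistent with $M(t)$ being even when $\mu = 0$.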

Answer

Clearly, if $X$ has a distribution symmetric around zero (for normals, exactly when $\mu=0$), then it holds. Nothing about the MGF is special: you are simply using that $X$ and $-X$ have the same distribution, so $E[f(X)] = E[f(-X)]$ for any function $f$.
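To illustrate the symmetry claim itself (a sketch under the assumption $\mu = 0$, $\sigma = 1$; the seed, sample size, and thresholds are arbitrary), one can compare the empirical CDFs of a sample of $X$ and of $-X$, which should agree up to Monte Carlo error:

```python
import random

# If X ~ Normal(0, 1), then -X has the same distribution, so the
# empirical CDFs of a sample and its negation should nearly coincide.
rng = random.Random(42)
xs = [rng.gauss(0.0, 1.0) for _ in range(100_000)]

def ecdf(data, c):
    """Empirical CDF of the sample `data` evaluated at c."""
    return sum(x <= c for x in data) / len(data)

neg = [-x for x in xs]
gaps = [abs(ecdf(xs, c) - ecdf(neg, c)) for c in (-1.0, 0.0, 1.0)]
```

The gaps are on the order of $1/\sqrt{n}$, as expected for two samples from the same distribution.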

Answer

For any random variable $X$, let $M(t)=E(e^{tX})$; then $E(e^{-tX})= M(-t)$, so your question is essentially: when does $M(t)=M(-t)$ hold, i.e. when is the moment generating function an even function?

For the normal distribution, $M(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$, and $M(t)/M(-t) = e^{2\mu t}$, so $M(t)$ is even if and only if $\mu = 0$.
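As a quick check of the last claim (a sketch; the parameter values are arbitrary), the difference $M(t)-M(-t)$ vanishes when $\mu = 0$, while for $\mu \neq 0$ the ratio $M(t)/M(-t)$ equals $e^{2\mu t} \neq 1$:

```python
import math

def normal_mgf(t, mu, sigma):
    """MGF of Normal(mu, sigma^2): exp(mu*t + sigma^2 * t^2 / 2)."""
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

t = 0.7
# mu = 0: the MGF is even, so M(t) - M(-t) = 0.
sym = normal_mgf(t, mu=0.0, sigma=2.0) - normal_mgf(-t, mu=0.0, sigma=2.0)
# mu != 0: the ratio M(t)/M(-t) = e^{2*mu*t} differs from 1.
ratio = normal_mgf(t, mu=1.0, sigma=2.0) / normal_mgf(-t, mu=1.0, sigma=2.0)
```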