I was reading about the moment generating function and I understand how I can find any moment with it. I was wondering, however, what happens if I evaluate the function at a particular value in its domain ($\mathbb{R}$). Is it of any use?
Thank you.
If $\psi$ is the moment generating function of $X$, then, as you have discovered, $E(X^k)=\psi^{(k)}(0)$: the $k$th moment is the $k$th derivative of the mgf evaluated at $0$.
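As a quick numerical illustration of this (my own sketch, not part of the standard derivation): take an Exponential$(1)$ random variable, whose mgf is $\psi(t)=1/(1-t)$ for $t<1$, and approximate the derivatives at $0$ by finite differences. This recovers $E(X)=1$ and $E(X^2)=2$:

```python
import math

# mgf of an Exponential(1) random variable: psi(t) = 1/(1 - t), valid for t < 1
def psi(t):
    return 1.0 / (1.0 - t)

h = 1e-4  # step size for the finite-difference approximations

# E(X)   = psi'(0),  approximated by a central first difference
m1 = (psi(h) - psi(-h)) / (2 * h)

# E(X^2) = psi''(0), approximated by a central second difference
m2 = (psi(h) - 2 * psi(0) + psi(-h)) / h**2

print(m1, m2)  # approximately 1 and 2, the known first two moments
```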
Example of a use of mgf's on $\mathbb R^+$:
In calculating Chernoff bounds, which are sharper than Chebyshev's inequality for large $n$, we often want a bound of the form
$$\Pr\left(|\bar X-\mu|>u\right)\le\text{some bound}$$
Chebyshev's inequality gives this bound as $\frac{\operatorname{Var}(X)}{nu^2}$. For the Chernoff bound, write $X=\sum_{i=1}^n X_i$, so that $\bar X = X/n$ and the event $|\bar X-\mu|>u$ becomes $|X-n\mu|>nu$; we then bound
$$\Pr(X-n\mu>nu)+\Pr(-(X-n\mu)>nu)$$
If $X$ is a sum of iid random variables and has mgf $\psi$, then $\Pr(X-n\mu>nu)\le\min_{t>0}\psi(t)e^{-nt\mu}e^{-ntu}$ and $\Pr(-(X-n\mu)>nu)\le\min_{t>0}\psi(-t)e^{nt\mu}e^{-ntu}$. So we find the $t>0$ minimizing each expression, which requires evaluating the mgf of $X$ at points other than $0$, and the overall bound is the sum of the two minima.
In general, the Chernoff bound says $\Pr(X>t)\le\min_{s>0}\psi(s)e^{-st}$ for any $t\in \mathbb R$.
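To see how much sharper this can be, here is a small numerical sketch (my own illustration): let $X$ be the sum of $n$ iid standard normals, so $\mu=0$ and $\psi(t)=e^{nt^2/2}$. A grid search over $t>0$ finds the optimal one-sided Chernoff bound $e^{-nu^2/2}$ for $\Pr(\bar X>u)$, which is far below the Chebyshev bound $1/(nu^2)$:

```python
import math

# X = sum of n iid N(0,1) variables (mu = 0), bounding Pr(Xbar > u) = Pr(X > n*u).
# The mgf of X is psi(t) = exp(n * t^2 / 2).
n, u = 100, 0.5

def psi(t):
    return math.exp(n * t * t / 2)

# Chernoff: minimize psi(t) * exp(-n*t*u) over t > 0 via a simple grid search
chernoff = min(psi(t) * math.exp(-n * t * u)
               for t in (k / 100 for k in range(1, 201)))

# Chebyshev: Var(X_1) / (n * u^2), with Var(X_1) = 1
chebyshev = 1 / (n * u * u)

print(chernoff, chebyshev)
# the minimum occurs at t = u, giving exp(-n*u^2/2) = exp(-12.5),
# versus the Chebyshev bound 0.04
```

The grid happens to contain the exact minimizer $t=u=0.5$ here; in general one would solve for the optimal $t$ analytically or with a proper one-dimensional optimizer.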