What are the interest of the moments of a random variable?


Let $X$ be a random variable. We define the moment of order $r\in\mathbb N$ by $m_r=\mathbb E[X^r]$. I know that the moment of order $1$ is the expectation, that from the moment of order $2$ one can get the variance, and that the moments of order $3$ and $4$ can be interesting, but in what way are moments of higher order $r$ interesting? Also, we define the moment generating function by $M_X(t)=\mathbb E[e^{tX}]$. Why such a definition, and not simply $$\sum_{r=0}^\infty m_rt^r\ \ ?$$ (Normally, the generating function of a sequence $(u_n)$ is $f(t)=\sum_{n\in\mathbb N}u_nt^n$.) Also, what is the interest of the moment generating function? I really don't see why it is interesting (apart maybe from the fact that $M_X^{(r)}(0)=m_r$).


Best answer:

... but in what way are moments of order r interesting?

One example: in statistics, moments of higher order may be needed in the method of moments, where the parameters of a distribution are estimated by matching sample moments to their theoretical expressions.
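As a concrete sketch (my own illustration, not from the original answer): for a Gamma distribution with shape $k$ and scale $\theta$, the first two moments give $\mathbb E[X]=k\theta$ and $\operatorname{Var}(X)=k\theta^2$, so solving these two equations against the sample moments yields method-of-moments estimators. The distribution and parameter values below are arbitrary choices for the demonstration.

```python
import random

# Method-of-moments estimation for Gamma(shape=k, scale=theta):
#   E[X]   = k * theta
#   Var(X) = k * theta**2
# => theta = Var(X) / E[X],  k = E[X] / theta.
random.seed(0)
k_true, theta_true = 3.0, 2.0
sample = [random.gammavariate(k_true, theta_true) for _ in range(100_000)]

n = len(sample)
m1 = sum(sample) / n                  # sample estimate of E[X]
m2 = sum(x * x for x in sample) / n   # sample estimate of E[X^2]
var = m2 - m1 ** 2                    # sample variance

theta_hat = var / m1    # estimate of the scale theta
k_hat = m1 / theta_hat  # estimate of the shape k

print(k_hat, theta_hat)  # should be close to 3.0 and 2.0
```

The estimators are consistent, so with 100,000 draws both values land close to the true parameters.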

Why such a definition, and not simply ...

The moment generating function of a random variable is not defined merely for calculating its moments. It has other important properties, such as $M_{X+Y}(t)=M_X(t)M_Y(t)$ when $X$ and $Y$ are independent. Perhaps most importantly, it characterizes a distribution: two random variables whose moment generating functions agree (and are finite) on a neighbourhood of $0$ have the same distribution.
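The multiplicative property can be checked numerically. A minimal Monte Carlo sketch (not a proof; the choices $X\sim\mathcal N(0,1)$, $Y\sim\mathrm{Exp}(1)$ and $t=0.3$ are arbitrary illustrations):

```python
import math
import random

# Numerical check that M_{X+Y}(t) = M_X(t) * M_Y(t) for independent X, Y.
random.seed(1)
t = 0.3
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]   # X ~ Normal(0, 1)
ys = [random.expovariate(1.0) for _ in range(n)]  # Y ~ Exp(1), independent of X

def mgf(sample, t):
    """Monte Carlo estimate of M(t) = E[exp(t * X)]."""
    return sum(math.exp(t * x) for x in sample) / len(sample)

lhs = mgf([x + y for x, y in zip(xs, ys)], t)  # estimate of M_{X+Y}(t)
rhs = mgf(xs, t) * mgf(ys, t)                  # estimate of M_X(t) * M_Y(t)
print(lhs, rhs)  # the two values should nearly agree
```

Here the exact values are $e^{t^2/2}\cdot\frac{1}{1-t}\approx 1.494$ for both sides, and the two estimates differ only by Monte Carlo noise.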

Note also that expanding $e^{tX}$ gives $M_X(t)=\sum_{r=0}^\infty m_r\,t^r/r!$: the moment generating function is exactly the *exponential* generating function of the sequence of moments. Even in the study of infinite sequences, exponential generating functions may be more convenient than ordinary generating functions in some situations, since dividing by $r!$ tames the growth of the coefficients and improves convergence.
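For instance (my own worked example), for $X\sim\mathrm{Exp}(1)$ one has $M_X(t)=\frac{1}{1-t}=\sum_{r\ge 0} r!\,\frac{t^r}{r!}$, so the exponential-generating-function reading predicts $m_r=r!$. A quick Monte Carlo sketch confirming this:

```python
import math
import random

# For X ~ Exp(1), M_X(t) = 1/(1-t) = sum_r r! * t^r / r!,
# so reading off the EGF coefficients predicts m_r = E[X^r] = r!.
random.seed(2)
sample = [random.expovariate(1.0) for _ in range(500_000)]
for r in range(1, 5):
    m_r = sum(x ** r for x in sample) / len(sample)  # Monte Carlo estimate of E[X^r]
    print(r, m_r, math.factorial(r))                 # estimate vs r!
```

Each estimated moment sits close to $r!$, up to Monte Carlo error that grows with $r$.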

... what is the interest of the moment generating function?

You could first read the Wikipedia article on the moment generating function. Again, it is not simply a tool for calculating moments. You may also want to take a look at a more often used cousin, the characteristic function $\varphi_X(t)=\mathbb E[e^{itX}]$, which is essentially the Fourier transform of the distribution of $X$ and, unlike the moment generating function, always exists. A classical proof of the central limit theorem uses characteristic functions.