I am looking to understand the intuition behind moments of random variables.
I understand the first moment relating to the mean and the second one relating to the variance. But what use does the n-th moment have?
Thank you
Probably you also know that the third moment tells you about the asymmetry of the distribution (skewness), and the fourth moment tells you how heavy the tails are (kurtosis). But perhaps you can get some insight from the moment generating function
$$ M_X(t) = \mathbb{E}\left[e^{tX}\right] = \mathbb{E}\left[\sum_k \frac{t^k}{k!}X^k\right] = \sum_k \frac{t^k}{k!}\mathbb{E}[X^k] = \sum_k \frac{t^k}{k!}\left.\frac{{\rm d}^k M_X(t)}{{\rm d}t^k}\right|_{t=0} $$
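As a quick sanity check on that relation (my own sketch, not part of the original answer), take the exponential distribution with rate 1, whose MGF is $1/(1-t)$ and whose $k$-th raw moment is $k!$; differentiating the MGF at $t=0$ with `sympy` recovers exactly those moments:

```python
import sympy as sp

t = sp.symbols('t')
# MGF of the exponential distribution with rate 1 (valid for t < 1)
M = 1 / (1 - t)

# The k-th raw moment E[X^k] is the k-th derivative of the MGF at t = 0
for k in range(1, 5):
    moment = sp.diff(M, t, k).subs(t, 0)
    print(k, moment, sp.factorial(k))  # the two values agree: E[X^k] = k!
```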
If you know all moments $\mathbb{E}[X^k]$ (or equivalently $M_X(t)$), you can try to invert the problem and actually recover the underlying PDF; this is known as the moment problem.
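To make the skewness/kurtosis remark concrete, here is a minimal sketch (an illustration I added, assuming a right-skewed exponential sample) that builds the standardized third and fourth moments out of the raw moments $\mathbb{E}[X^k]$ and compares them with `scipy.stats`:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)  # right-skewed, heavy-ish right tail

# Raw moments E[X^k] estimated from the sample, k = 1..4
m = [np.mean(x**k) for k in range(1, 5)]

mu = m[0]                # mean (first raw moment)
var = m[1] - mu**2       # variance from the first two raw moments
# Central moments expressed through raw moments
mu3 = m[2] - 3*mu*m[1] + 2*mu**3
mu4 = m[3] - 4*mu*m[2] + 6*mu**2*m[1] - 3*mu**4

skewness = mu3 / var**1.5           # asymmetry (about 2 for an exponential)
excess_kurtosis = mu4 / var**2 - 3  # tail weight (about 6 for an exponential)

print(skewness, stats.skew(x))
print(excess_kurtosis, stats.kurtosis(x))
```

The point of the sketch is simply that once you have the raw moments, the familiar shape descriptors (skewness, kurtosis) are just standardized combinations of them.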