If a random variable has infinite variance, does it necessarily have infinite expectation or is its expectation not well-defined?
Does infinite variance imply infinite expectation?
The other answer is excellent, but here is a less rigorous way to see why this is the case.
Let us consider the sequence of random variables $X_n \sim $ Uniform($-n,n$)
Each $X_n$ is symmetric about $0$, so they all have mean $0$.
The variance of a Uniform($a,b$) random variable is $\frac{1}{12}(b-a)^2$, so with $a = -n$ and $b = n$ the variance of our $X_n$ is $\frac{1}{12}(2n)^2 = \frac{1}{3}n^2$.
Hence $\mathbb{E}[X_n] = 0 $ $\forall n$
However Var$(X_n) =\frac{1}{3}n^2 \to \infty$
A similar thing could be done with $Y_n \sim \mathcal{N}(0,n)$, whose variance $n$ also grows without bound.
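The behaviour of the $X_n$ above is easy to check empirically. This is a small sketch (the helper `sample_stats` is my own name, not from the answer) that draws from Uniform($-n,n$) for growing $n$ and compares the sample mean and variance with the analytic values $0$ and $n^2/3$:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_stats(n, size=1_000_000):
    """Draw `size` samples from Uniform(-n, n); return sample mean and variance."""
    x = rng.uniform(-n, n, size)
    return x.mean(), x.var()

for n in [1, 10, 100]:
    m, v = sample_stats(n)
    # Analytic values: mean 0, variance (2n)^2 / 12 = n^2 / 3.
    print(f"n={n:3d}  mean~{m:9.4f}  var~{v:12.2f}  n^2/3={n**2 / 3:10.2f}")
```

The sample mean stays near $0$ for every $n$ while the sample variance tracks $n^2/3$, which diverges as $n \to \infty$.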
Loosely speaking, the mean is the random variable's average value, where its mass is centred, while the variance measures how far it deviates from that centre on average. But we can construct a random variable that is clearly centred yet deviates an unbounded amount. The key point is that how far something deviates and what it deviates from are separate pieces of information, so an infinite variance by itself tells you nothing about the mean.
You may notice that some parameters (like the $a$ and $b$ in our uniform, or the $\lambda$ in a Poisson) appear in both the mean and the variance, so intuition naturally suggests the two quantities are linked; in general, however, they are not.
No, the fact that the variance (or rather, the second moment) is infinite tells you nothing about the expectation.
The examples above are continuous distributions, but there are analogous examples among discrete distributions on $\mathbb{N}$ or $\mathbb{Z}$.
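One concrete discrete construction (my own illustrative example, not one given in the answers): put weight proportional to $1/k^3$ on each of $\pm k$ for $k \geq 1$. By symmetry the mean is $0$ and it exists, since $\sum_k k \cdot k^{-3} = \sum_k k^{-2}$ converges; but $\sum_k k^2 \cdot k^{-3} = \sum_k k^{-1}$ is the harmonic series, so the second moment, hence the variance, is infinite. A short sketch of the two partial sums:

```python
from math import fsum

def partial(p, K):
    """Partial sum of k^p * (1/k^3) for k = 1..K (unnormalised weights on +-k).

    p=1 gives the tail of E|X|: sum 1/k^2, which converges.
    p=2 gives the tail of E[X^2]: sum 1/k, the harmonic series, which diverges.
    """
    return fsum(k**p / k**3 for k in range(1, K + 1))

for K in [10**2, 10**4, 10**6]:
    print(f"K={K:8d}  E|X| partial sum ~ {partial(1, K):.4f}"
          f"  E[X^2] partial sum ~ {partial(2, K):.4f}")
```

The first column stabilises (towards $\pi^2/6$) while the second keeps growing like $\ln K$: a finite, well-defined mean alongside an infinite variance.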