Can you think of a random variable $X$ where: $\operatorname{E}[X]$ exists, and $\operatorname{E}[X^2]$ doesn't?
I'm not sure if I remember correctly, but I recall hearing in the lecture that if $\operatorname{E}[X]$ exists, then we can use the alternative formula for the variance:
$$ \operatorname{Var}[X] =\operatorname{E}[X^2] - (\operatorname{E}[X])^2 $$
However, when reviewing my notes I realised that the precondition for using this alternative formula for the variance is different: it requires that $\operatorname{E}[X^2]<\infty$ holds (which implies that $\operatorname{E}[X]$ exists).
Is there a way to prove that $$ \operatorname{E}[X] \text{ exists} \Longrightarrow \operatorname{E}[X^2] \text{ exists}, $$ or is there a counterexample where $\operatorname{E}[X]$ exists and $\operatorname{E}[X^2]$ doesn't?
As you may know, $$\mu = \mathbb{E}[X] = \int x\, p(x)\,dx$$ and $$\sigma^2 = \operatorname{var}[X] = \mathbb{E}[X^2] - \mathbb{E}[X]^2 = \int x^2 p(x)\,dx - \left( \int x\, p(x)\,dx \right)^2.$$ The assertion $$\mathbb{E}[X] \in \mathbb{R} \Rightarrow \mathbb{E}[X^2] \in \mathbb{R}$$ is not valid in general, while the converse, namely $$\mathbb{E}[X^2] \in \mathbb{R} \Rightarrow \mathbb{E}[X] \in \mathbb{R},$$ is always true.
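For completeness, here is one standard way to see why the converse always holds (a short argument via the Cauchy–Schwarz inequality, not spelled out above):

$$\operatorname{E}|X| = \operatorname{E}\bigl[|X|\cdot 1\bigr] \le \sqrt{\operatorname{E}[X^2]}\,\sqrt{\operatorname{E}[1]} = \sqrt{\operatorname{E}[X^2]},$$

so whenever $\operatorname{E}[X^2] < \infty$, the first moment $\operatorname{E}|X|$ is finite as well, and hence $\operatorname{E}[X]$ exists.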
In fact there exist infinitely many probability distributions for which the mean is defined while the variance is not: one example is the Pareto distribution with shape parameter $\alpha \in (1,2]$.
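To see the Pareto counterexample concretely, here is a minimal sketch (assuming the standard Pareto density $p(x) = \alpha x^{-\alpha-1}$ on $[1,\infty)$ and the illustrative choice $\alpha = 1.5$): the truncated integral for $\operatorname{E}[X]$ converges as the cutoff grows, while the one for $\operatorname{E}[X^2]$ diverges.

```python
import math

# Pareto(alpha) on [1, inf): density p(x) = alpha * x**(-alpha - 1).
# For alpha = 1.5 (i.e. alpha in (1, 2]) the mean is finite but E[X^2] is not.
ALPHA = 1.5

def truncated_moment(k, T, alpha=ALPHA):
    """Closed form of integral_1^T x**k * alpha * x**(-alpha - 1) dx."""
    e = k - alpha
    if e == 0:
        return alpha * math.log(T)  # boundary case k = alpha
    return alpha * (T**e - 1) / e

for T in (1e2, 1e4, 1e6):
    m1 = truncated_moment(1, T)  # approaches alpha/(alpha-1) = 3 as T grows
    m2 = truncated_moment(2, T)  # equals 3*(sqrt(T) - 1): grows without bound
    print(f"T={T:9.0e}  E[X; x<=T]={m1:8.4f}  E[X^2; x<=T]={m2:12.1f}")
```

The first column of moments stabilises near $3 = \alpha/(\alpha-1)$, while the second grows like $\sqrt{T}$, which is exactly the "mean exists, second moment doesn't" behaviour asked about in the question.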