I'm trying to prove that if $\mathbb{E}\left[X\right]=\infty$ then $\mathbb{E}\left[X^{2}\right]=\infty$ for every random variable $X$.
I know that if $X(\omega)>1$ for every $\omega$ then $X^2(\omega)>X(\omega)$, so $\mathbb{E}\left[X^{2}\right]\ge\mathbb{E}\left[X\right]$,
and if $X(\omega)\le1$ for every $\omega$ then $X^2(\omega)\le X(\omega)\le 1$, so
$$\mathbb{E}\left[X^{2}\right]=\sum_{\omega\in\Omega}X^{2}\left(\omega\right)\cdot p(\omega)\le\sum_{\omega\in\Omega}1\cdot p(\omega)=1.$$
But how do I prove it when $X(\omega)>1$ for some $\omega\in\Omega$ and $X(\omega)\le1$ for others?
Is this even the right way to prove this?
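One way to finish the argument along these lines (a sketch, assuming $X\ge 0$): split the sum defining $\mathbb{E}[X]$ according to whether $X(\omega)\le 1$ or $X(\omega)>1$, and bound each piece separately:

$$\mathbb{E}[X]=\sum_{\omega:\,X(\omega)\le1}X(\omega)\,p(\omega)+\sum_{\omega:\,X(\omega)>1}X(\omega)\,p(\omega)\le 1+\sum_{\omega:\,X(\omega)>1}X^{2}(\omega)\,p(\omega)\le 1+\mathbb{E}\left[X^{2}\right].$$

So $\mathbb{E}\left[X^{2}\right]\ge\mathbb{E}[X]-1$, and $\mathbb{E}[X]=\infty$ forces $\mathbb{E}\left[X^{2}\right]=\infty$.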
By Jensen's inequality we know that for convex $f$
$$ f\left(\operatorname{E}[X]\right) \leq \operatorname{E}\left[f(X)\right] $$
The fact that $\operatorname{E}[X]=\infty \implies \operatorname{E}\left[ X^2 \right] = \infty$ follows from the observation that $f(x)=x^2$ is convex.
Edit: As noted in the comments below, for this argument to be rigorous one needs to multiply $X$ inside the expectation by $\mathbf 1(\vert X \vert \leq k)$, apply Jensen's inequality to the truncated variable, and then take $k \to \infty$ with an appeal to the monotone convergence theorem.
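Spelled out, the truncation step looks like this (a sketch, assuming $X\ge 0$). For each fixed $k$, the truncated variable $X\,\mathbf 1(X\le k)$ is bounded, so Jensen's inequality applies:

$$\left(\operatorname{E}\left[X\,\mathbf 1(X\le k)\right]\right)^{2} \leq \operatorname{E}\left[X^{2}\,\mathbf 1(X\le k)\right] \leq \operatorname{E}\left[X^{2}\right].$$

Since $X\,\mathbf 1(X\le k)$ increases to $X$ as $k\to\infty$, the monotone convergence theorem gives $\operatorname{E}\left[X\,\mathbf 1(X\le k)\right]\to\operatorname{E}[X]=\infty$, so the left-hand side tends to $\infty$ and hence $\operatorname{E}\left[X^{2}\right]=\infty$.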