Let $\langle \Omega, \mathcal{F}, P \rangle$ be a probability space and let $X \, : \, \Omega \rightarrow \mathbb{R}^+$ be a random variable (assumed non-negative for simplicity). In general, the expected value is defined as a Lebesgue integral, and we want to relate it to a Riemann–Stieltjes integral (“L” for Lebesgue, “RS” for Riemann–Stieltjes):
$E[X] \equiv (L) \int_{\Omega} X(\omega) \, dP \stackrel{?}{=} (RS) \int_{X(\Omega)} x \, dF(x)$.
How do we actually prove the last step?
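As a quick numerical sanity check of the right-hand side (a sketch only; the choice of an Exponential(1) distribution is my assumption, not part of the question), when $F$ has a density $f$ the RS integral reduces to $\int x f(x) \, dx$:

```python
# Numerical sanity check (illustration only): for X ~ Exponential(1),
# the RS integral ∫ x dF(x) becomes ∫_0^∞ x e^{-x} dx, which equals E[X] = 1.
import math

dx = 0.001      # midpoint-rule step
t_max = 30.0    # truncation point; the neglected tail is ~3e-12

rs_integral = sum(
    (m := k * dx + dx / 2) * math.exp(-m) * dx
    for k in range(int(t_max / dx))
)
print(rs_integral)  # ≈ 1.0
```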
We have our function $X(\omega)$ and our measure $P$, so thinking in terms of the Lebesgue integral is natural. By the layer-cake (tail-probability) formula we can rewrite it:
$(L) \int_{\Omega} X(\omega) \, dP = (R) \int_{0}^{\infty} P \{ \omega \in \Omega: X(\omega) > t\} \, dt$,
and the integrand here is exactly $1 - F(t)$, the tail of the CDF.
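The layer-cake formula can also be checked numerically (a sketch; again Exponential(1) is an arbitrary illustrative choice with $E[X] = 1$), by comparing a sample mean against a Riemann sum over the empirical tail probability:

```python
# Monte Carlo check of the layer-cake identity E[X] = ∫_0^∞ P(X > t) dt.
# X ~ Exponential(1) is an arbitrary illustrative choice (E[X] = 1).
import bisect
import random

random.seed(0)
n = 100_000
xs = sorted(random.expovariate(1.0) for _ in range(n))
sample_mean = sum(xs) / n

# Left-endpoint Riemann sum of the empirical tail probability P(X > t).
dt, t_max = 0.01, 20.0
tail_integral = 0.0
for k in range(int(t_max / dt)):
    t = k * dt
    p_tail = (n - bisect.bisect_right(xs, t)) / n  # fraction of samples > t
    tail_integral += p_tail * dt

print(sample_mean, tail_integral)  # both ≈ 1.0
```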
UPD: assume additionally that $E[X] < \infty$. Then, integrating by parts,
$\int_{0}^{\infty} P \{ \omega \in \Omega: X(\omega) > t\} \, dt = \int_{0}^{\infty} (1 - F(t)) \, dt = \lim_{u \to \infty} [(1 - F(u))u + \int_{0}^{u} x \, dF(x)]$.
It feels almost done. How can we show that the first term tends to zero? It is an indeterminate $0 \cdot \infty$ form, since $1 - F(u) \to 0$ while $u \to \infty$.
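For what it’s worth, one standard estimate (a sketch; it uses only that $X \ge 0$ and $E[X] < \infty$) bounds the term by a tail of the convergent integral:

$(1 - F(u)) \, u = u \, P\{X > u\} = \int_{(u,\infty)} u \, dF(x) \le \int_{(u,\infty)} x \, dF(x) \xrightarrow[u \to \infty]{} 0$,

since $\int_0^{\infty} x \, dF(x) = E[X] < \infty$ and tails of a convergent integral vanish.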