I found the following theorem in Allan Gut, Probability theory (without proof): Let $X$ be a nonnegative random variable and $g$ a nonnegative differentiable strictly increasing function. Then $$Eg(X)=g(0)+\int_{(0,\infty)}g'(x) \mathbb{P}(X>x)\,dx,$$ and $$Eg(X)<\infty\Longleftrightarrow \sum_{n=1}^\infty g'(n)\mathbb{P}(X>n)<\infty.$$
The first statement I could prove (let $X\sim F$):
\begin{align*} Eg(X)=\int_{[0,\infty)} g(x)\,dF(x) & =\int_{[0,\infty)} \left(g(0)+\int_{(0,x)}g'(t)\,dt\right)dF(x)\\[8pt] & =g(0)+\int_{(0,\infty)}g'(t) \mathbb{P}(X>t)\,dt, \end{align*} by Tonelli's theorem (the integrand is nonnegative). But I am struggling with the second statement. My idea is the following: $$Eg(X)=\int_{(0,\infty)} g(x)\,dF(x)=\sum_{k=1}^\infty \int_{(n_k,n_{k+1})} g(x) \, dF(x)$$
for some increasing sequence $(n_k)$ satisfying $\bigcup_{k=1}^\infty [n_k,n_{k+1}] = \mathbb{R}^+$. But I cannot find an appropriate sequence. Any suggestions? Thank you.
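As a numerical sanity check of the first identity, one can pick a concrete pair $X$ and $g$ and compare both sides. The choices below ($X \sim \mathrm{Exp}(1)$ and $g(x) = x^2 + 1$) are mine, purely for illustration: here $Eg(X) = E[X^2] + 1 = 3$ and $g(0) + \int_0^\infty g'(x)\mathbb{P}(X>x)\,dx = 1 + \int_0^\infty 2x e^{-x}\,dx = 3$.

```python
import math

def survival(x):
    """P(X > x) for X ~ Exp(1)."""
    return math.exp(-x)

def g_prime(x):
    """g'(x) for g(x) = x^2 + 1."""
    return 2 * x

# Crude Riemann sum for the integral over (0, 50];
# the tail beyond 50 is negligible for Exp(1).
dx = 1e-4
integral = sum(g_prime(k * dx) * survival(k * dx) * dx
               for k in range(1, int(50 / dx) + 1))

lhs = 3.0               # E g(X), computed analytically
rhs = 1.0 + integral    # g(0) + integral
print(lhs, rhs)         # both sides are approximately 3
```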
Without further assumptions on the monotonicity of the derivative, the claim is in general false.
Example. Let
$$h(x) := \begin{cases} 2 \left(n^2-\frac{1}{n^2} \right) \cdot (x-n)+\frac{1}{n^2} & x \in \left[n,n+\frac{1}{2} \right] \\ n^2- 2 \left(n^2-\frac{1}{(n+1)^2} \right) \left(x-n-\frac{1}{2} \right) & x \in \left[n+ \frac{1}{2},n+1 \right] \end{cases}$$
for $n \in \mathbb{N}$.
Then $h$ is a (strictly) positive continuous function, and therefore
$$g(x) := \int_0^x h(y) \, dy$$
defines a strictly increasing differentiable positive function. In particular, $g'(n)= \frac{1}{n^2}$ and $$g' (x) \geq \frac{n^2}{2}, \qquad x \in \left[n+\frac{1}{4},n+\frac{3}{4} \right]. \tag{1}$$
Now if we consider a random variable $X$ such that $\mathbb{P}(X > x) = \frac{1}{x}$ for $x$ sufficiently large, then $(1)$ gives, for all large $n$,
$$\int_{(n+\frac{1}{4},n+\frac{3}{4})} g'(x) \mathbb{P}(X>x) \, dx \geq \frac{1}{2} \cdot \frac{n^2}{2} \cdot \frac{1}{n+1} \xrightarrow[n \to \infty]{} \infty,$$
and hence
$$\int_{(0,\infty)} g'(x) \mathbb{P}(X>x) \, dx = \infty.$$
On the other hand,
$$\sum_{n} g'(n) \mathbb{P}(X >n) \leq \sum_n g'(n) = \sum_n \frac{1}{n^2} < \infty.$$
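A rough numerical illustration of this counterexample (the step size and cutoffs below are arbitrary choices of mine): the partial integrals $\int_1^N g'(x)\mathbb{P}(X>x)\,dx$ blow up, while the series stays bounded.

```python
# h is the piecewise-linear function defined above;
# here we simply take P(X > x) = 1/x for all x >= 1.

def h(x):
    """The piecewise-linear function from the example (for x >= 1)."""
    n = int(x)
    if x <= n + 0.5:
        return 2 * (n**2 - 1 / n**2) * (x - n) + 1 / n**2
    return n**2 - 2 * (n**2 - 1 / (n + 1)**2) * (x - n - 0.5)

def survival(x):
    """P(X > x) = 1/x for x >= 1."""
    return min(1.0, 1 / x)

# Partial Riemann sums of int_1^N g'(x) P(X > x) dx grow without bound ...
dx = 1e-3
partials = []
for N in (10, 100, 300):
    I = sum(h(k * dx) * survival(k * dx) * dx
            for k in range(int(1 / dx), int(N / dx)))
    partials.append(I)
    print(N, I)

# ... while the series is dominated by sum_n 1/n^2 and stays bounded.
S = sum(h(n) * survival(n) for n in range(1, 10001))
print(S)
```

The printed partial integrals increase rapidly with $N$, whereas the series total stays well below $\sum_n 1/n^2 = \pi^2/6 \approx 1.645$.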