Existence of expected values of positive random variables


I found the following theorem in Allan Gut, Probability theory (without proof): Let $X$ be a nonnegative random variable and $g$ a nonnegative differentiable strictly increasing function. Then $$Eg(X)=g(0)+\int_{(0,\infty)}g'(x) \mathbb{P}(X>x)\,dx,$$ and $$Eg(X)<\infty\Longleftrightarrow \sum_{n=1}^\infty g'(n)\mathbb{P}(X>n)<\infty.$$

The first statement I could prove (let $X\sim F$):

\begin{align*} Eg(X)=\int_{(0,\infty)} g(x)\,dF(x) &= g(0)+\int_{(0,\infty)} \int_{(0,x)}g'(t)\,dt\,dF(x)\\[8pt] &= g(0)+\int_{(0,\infty)}g'(t)\, \mathbb{P}(X>t)\,dt, \end{align*} by Fubini. But I am struggling with the second statement. My idea is the following: $$Eg(X)=\int_{(0,\infty)} g(x)\,dF(x)=\sum_{k=1}^\infty \int_{(n_k,n_{k+1})} g(x) \, dF(x)$$

for some increasing sequence satisfying $\bigcup_{k=1}^\infty [n_k,n_{k+1}] = \mathbb{R}^+$. But I cannot find an appropriate sequence. Any suggestions? Thank you.
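As a numerical sanity check of the first identity, one can pick a concrete pair $X$ and $g$ and compare both sides. The choices below ($X\sim\mathrm{Exp}(1)$, $g(x)=x^2+1$) are purely illustrative, not part of the theorem:

```python
import math

# Check E[g(X)] = g(0) + \int_0^\infty g'(x) P(X > x) dx
# for the illustrative choices X ~ Exp(1) (so P(X > x) = e^{-x})
# and g(x) = x^2 + 1.

g = lambda x: x**2 + 1
g_prime = lambda x: 2 * x
survival = lambda x: math.exp(-x)  # P(X > x) for Exp(1)

# Left-hand side in closed form: E[X^2] + 1 = 2 + 1 = 3 for Exp(1).
exact = 3.0

# Right-hand side via the trapezoidal rule on [0, 50]
# (the integrand decays like x e^{-x}, so the tail is negligible).
n, a, b = 200000, 0.0, 50.0
h = (b - a) / n
xs = [a + i * h for i in range(n + 1)]
vals = [g_prime(x) * survival(x) for x in xs]
rhs = g(0) + h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

print(exact, rhs)  # both ≈ 3
```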


Best answer:

Without further assumptions on the monotonicity of the derivative, the claim is in general not correct.

Example Let

$$h(x) := \begin{cases} 2 \left(n^2-\frac{1}{n^2} \right) \cdot (x-n)+\frac{1}{n^2} & x \in \left[n,n+\frac{1}{2} \right] \\ n^2- 2 \left(n^2-\frac{1}{(n+1)^2} \right) \left(x-n-\frac{1}{2} \right) & x \in \left[n+ \frac{1}{2},n+1 \right] \end{cases}$$

for $n \in \mathbb{N}$.

(Figure: graph of the piecewise linear function $h$, oscillating between $\frac{1}{n^2}$ at the integers and $n^2$ at the half-integers.)

Then $h$ is a (strictly) positive continuous function, and therefore

$$g(x) := \int_0^x h(y) \, dy$$

defines a strictly increasing differentiable positive function. In particular, $g'(n)= \frac{1}{n^2}$ and $$g' (x) \geq \frac{n^2}{2}, \qquad x \in \left[n+\frac{1}{4},n+\frac{3}{4} \right]. \tag{1}$$

Now, if we consider a random variable $X$ such that $\mathbb{P}(X > x) = \frac{1}{x}$ for $x$ sufficiently large, then $(1)$ shows that

$$\int_{(0,\infty)} g'(x) \mathbb{P}(X>x) \, dx = \infty,$$

since the interval $\left[n+\frac{1}{4},n+\frac{3}{4}\right]$ contributes at least $\frac{1}{2} \cdot \frac{n^2}{2} \cdot \frac{1}{n+1} \to \infty$.

On the other hand,

$$\sum_{n} g'(n) \mathbb{P}(X >n) \leq \sum_n g'(n) = \sum_n \frac{1}{n^2} < \infty.$$
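The two sides of the counterexample can be checked numerically. The sketch below implements $h$ directly from its piecewise definition and verifies both $g'(n)=\frac{1}{n^2}$ and the lower bound $(1)$; the survival function $\mathbb{P}(X>x)=\frac{1}{x}$ is taken for all $x\geq 1$, which is one admissible choice:

```python
# Numerical check of the counterexample: h interpolates between 1/n^2
# at the integers and n^2 at the half-integers.

def h(x):
    """Piecewise linear h from the answer, for x >= 1."""
    n = int(x)  # floor of x, since x >= 1
    if x <= n + 0.5:
        return 2 * (n**2 - 1 / n**2) * (x - n) + 1 / n**2
    return n**2 - 2 * (n**2 - 1 / (n + 1)**2) * (x - n - 0.5)

# g'(n) = h(n) = 1/n^2 at the integers:
assert all(abs(h(n) - 1 / n**2) < 1e-12 for n in range(1, 50))

# Bound (1): g'(x) >= n^2/2 at the endpoints of [n + 1/4, n + 3/4]
# (and hence on the whole interval, by linearity of h there):
assert all(h(n + 0.25) >= n**2 / 2 and h(n + 0.75) >= n**2 / 2
           for n in range(1, 50))

# With P(X > x) = 1/x, each strip [n + 1/4, n + 3/4] contributes at
# least (1/2) * (n^2/2) * 1/(n+1), so the integral diverges, while the
# partial sums of g'(n) P(X > n) = 1/n^3 stay bounded:
partial = sum(h(n) * (1 / n) for n in range(1, 10**5 + 1))
print(partial)  # stays below sum 1/n^2 = pi^2/6
```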

Second answer:

Too long for a comment: there is a slight problem with using Tonelli's theorem in the derivation given by the OP. The problem is that it seems possible to build strictly increasing non-negative functions whose derivative is not Lebesgue integrable on any interval of the form $[0,x]$. However, this potential problem can be avoided with the following alternative derivation:

$$ \begin{align*} \mathrm{E}[g(X)]&=\int_{(0,\infty )}\Pr [g(X)>y]\mathop{}\!d y\\ &=\int_{(0,g(0)]}\overbrace{\Pr [g(X)>y]}^{=1}\mathop{}\!d y+\int_{(g(0),\infty )}\Pr [g(X)>y]\mathop{}\!d y\\ &=g(0)+\int_{(g(0),\infty )}\Pr [g(X)>y]\mathop{}\!d y\\ &=g(0)+\int_{(0,\infty )}\Pr [X>x]\mathop{}\!d g(x),\quad \text{ using the change of variable }y=g(x)\\ &=g(0)+\int_{(0,\infty )}g'(x)\Pr [X>x]\mathop{}\!d x \end{align*} $$
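The splitting of $\int_{(0,\infty)}\Pr[g(X)>y]\,dy$ at $y=g(0)$ can also be checked on a concrete example. The choices below ($X\sim\mathrm{Exp}(1)$, $g(x)=x^2+1$, so $g(0)=1$ and $\Pr[g(X)>y]=e^{-\sqrt{y-1}}$ for $y>1$) are illustrative only:

```python
import math

# Check E[g(X)] = g(0) + \int_{(g(0),\infty)} P(g(X) > y) dy
# for the illustrative choices X ~ Exp(1), g(x) = x^2 + 1.

g0 = 1.0  # g(0)

def surv_gX(y):
    """P(g(X) > y): equals 1 on (0, g(0)], e^{-sqrt(y-1)} beyond."""
    return math.exp(-math.sqrt(y - g0)) if y > g0 else 1.0

# The integral over (0, g(0)] is exactly g(0), since the integrand is 1.
# Tail integral over (g(0), 200] via the trapezoidal rule
# (the remaining tail beyond 200 is of order e^{-sqrt(199)}).
n, a, b = 400000, g0, 200.0
step = (b - a) / n
ys = [a + i * step for i in range(n + 1)]
vals = [surv_gX(y) for y in ys]
tail = step * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

print(g0 + tail)  # ≈ E[g(X)] = E[X^2] + 1 = 3
```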