Expectation of nonnegative random variable when passed through nonnegative increasing differentiable function


I am having trouble proving the following result:

Let $X$ be a nonnegative random variable and $g:\mathbb{R}\rightarrow\mathbb{R}$ a nonnegative strictly increasing differentiable function. Then

$$\mathbb{E}g(X)=g(0)+\int_{0}^{\infty}g^{\prime}(x)\mathbb{P}(X>x)dx$$

I know that it should follow using integration by parts, but using integration by parts in the more abstract setting of probability is a bit confusing to me. Details would be appreciated.
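Before turning to the proofs, here is a quick numerical sanity check of the identity (not part of any proof). Take $X\sim\mathrm{Exp}(1)$, so $\mathbb P(X>x)=e^{-x}$, and $g(x)=x^2$ (which is nonnegative and strictly increasing on $[0,\infty)$, where $X$ lives); both sides should equal $\mathbb E[X^2]=2$. The code below is an illustrative sketch, with the distribution and $g$ chosen only for convenience.

```python
import math
import random

# Check  E[g(X)] = g(0) + integral_0^inf g'(x) P(X > x) dx
# for X ~ Exp(1) (survival function e^{-x}) and g(x) = x^2.
# Both sides should be close to E[X^2] = 2.

g = lambda x: x * x
g_prime = lambda x: 2.0 * x
survival = lambda x: math.exp(-x)   # P(X > x) for Exp(1)

# Left side: Monte Carlo estimate of E[g(X)]
random.seed(0)
n = 200_000
lhs = sum(g(random.expovariate(1.0)) for _ in range(n)) / n

# Right side: g(0) plus a Riemann sum of g'(x) P(X > x) over [0, 50]
dx = 1e-3
rhs = g(0.0) + sum(g_prime(i * dx) * survival(i * dx) * dx
                   for i in range(int(50 / dx)))

print(lhs, rhs)  # both should be close to 2
```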


Accepted answer:

Since $g$ is nonnegative, we can write $g(X)$ as the length of the interval $[0,g(X)]$ (note that $g(0)\le g(X)$ because $X\ge 0$ and $g$ is increasing):
$$\mathbb E[g(X)] = \mathbb E\left[\int_0^{g(X)}dt\right] = \mathbb E\left[\int_{g(0)}^{g(X)}dt + \int_0^{g(0)} dt \right] = \mathbb E\left[\int_{g(0)}^{g(X)}dt \right] + \mathbb E[g(0)] = \mathbb E\left[\int_{g(0)}^{g(X)}dt \right] + g(0).$$

Now, using the definition of expectation, we get: \begin{align*} \mathbb E\left[\int_{g(0)}^{g(X)}dt \right] &= \int_\Omega \int_{g(0)}^{g(X(\omega))}dt \,d\mathbb P(\omega) = \int_\Omega \int_0^\infty \chi_{(g(0),g(X(\omega)))}(t)\,dt\,d\mathbb P(\omega)\\& = \int_0^\infty \int_\Omega \chi_{(g(0),g(X(\omega)))}(t)\,d\mathbb P(\omega)\, dt =\int_0^\infty \mathbb P( g(0) < t <g(X))\, dt \\&= \int_{g(0)}^\infty \mathbb P( g(0) < t <g(X))\, dt, \end{align*} where the last equality holds because the integrand vanishes for $t \le g(0)$.

The interchange of the two integrals is justified by Tonelli's theorem, since the integrand is nonnegative.

Finally, since $g$ is strictly increasing (hence invertible onto its range), $\mathbb P( t \in (g(0),g(X))) = \mathbb P( 0 <g^{-1}(t) < X)$.

So we get $$\int_{g(0)}^\infty \mathbb P( g(0) < t <g(X))\, dt = \int_{g(0)}^\infty \mathbb P(0 < g^{-1}(t) < X)\,dt = \int_0^\infty g'(s)\,\mathbb P( s < X)\, ds,$$ where the last step is the substitution $t = g(s)$, so that $dt = g'(s)\,ds$ and $s = g^{-1}(t)$ ranges over $(0,\infty)$ as $t$ ranges over $(g(0),\infty)$.

And we get $\mathbb E[g(X)] = g(0) + \int_0^\infty g'(s)\mathbb P(X>s)ds$
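As a sanity check, taking $g(x)=x$ (so $g(0)=0$ and $g'\equiv 1$) recovers the familiar layer-cake formula for a nonnegative random variable:

$$\mathbb E[X] = \int_0^\infty \mathbb P(X>s)\,ds.$$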

Another answer:

The general result is:

Claim: Let $g$ be differentiable. If $g$ and $g'$ are bounded, then $$ \bbox[5px,border:2px solid red] {E[g(X)]=g(0) +\int_0^\infty g'(x)P(X>x)\,dx-\int_{-\infty}^0 g'(x)P(X\le x)\, dx.}$$ The result also holds if $g$ is monotonic, provided the RHS is not $\infty-\infty$.

Proof: First suppose that $X$ is nonnegative. Write $$ g(X)-g(0)\stackrel{(1)}=\int_0^X g'(t)\,dt\stackrel{(2)}=\int_0^\infty g'(t) I_{X>t}\,dt.$$ Equality (1) is the fundamental theorem of calculus (remember $g$ is differentiable), while (2) is valid because the indicator random variable $I_{X>t}$ has value $1$ when $t<X$, and equals zero otherwise. Take expectation: $$ E[g(X)-g(0)]=E\left[\int_0^\infty g'(t) I_{X>t}\,dt\right]\stackrel{(3)}=\int_0^\infty g'(t)E[I_{X>t}]\,dt\stackrel{(4)}=\int_0^\infty g'(t)P(X>t)\,dt.$$ Identity (3) is the result of Fubini's theorem. In (4) we recognize that the expectation of the indicator of an event is the probability of the event.

Next, suppose $X$ is nonpositive. A similar argument shows $$E[g(X)-g(0)] = -\int_{-\infty}^0 g'(t)P(X\le t)\,dt.$$
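In more detail: for nonpositive $X$, the fundamental theorem of calculus gives $$g(0)-g(X)=\int_X^0 g'(t)\,dt=\int_{-\infty}^0 g'(t)\, I_{X<t}\,dt,$$ and taking expectations (Fubini again) yields $E[g(0)-g(X)]=\int_{-\infty}^0 g'(t)P(X<t)\,dt$. Replacing $P(X<t)$ by $P(X\le t)$ changes the integrand only at the at most countably many atoms of $X$, a Lebesgue-null set of $t$-values, so the integral is unchanged.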

For general $X$, write $$g(X)-g(0) = g(X^+)-g(0)+g(-X^-)-g(0)$$ where $X^+ := XI(X>0)$ is the positive part of $X$ and $X^-:=-XI(X<0)$ is the negative part. Apply the previous special cases to obtain $$ E[g(X^+) -g(0)]= \int_0^\infty g'(t)P(X^+>t)\,dt$$ and $$ E[g(-X^-)-g(0)]=-\int_{-\infty}^0 g'(t)P(-X^- \le t)\,dt.$$ To conclude, note that $\{X^+>t\}=\{X>t\}$ when $t>0$, and $\{-X^-\le t\} = \{X\le t\}$ when $t<0$.
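The two-sided formula in the Claim can also be checked numerically. The sketch below (not part of the proof) uses $X\sim N(0,1)$ and $g(x)=\tanh(x)$; both $g$ and $g'=\operatorname{sech}^2$ are bounded, so the Claim applies, and by symmetry both sides should be approximately $0$. The distribution and $g$ are arbitrary illustrative choices.

```python
import math
import random

# Check  E[g(X)] = g(0) + int_0^inf g'(x)P(X>x)dx - int_{-inf}^0 g'(x)P(X<=x)dx
# for X ~ N(0,1) and g = tanh (g and g' bounded). Both sides should be near 0.

Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # N(0,1) CDF
g = math.tanh
g_prime = lambda x: 1.0 / math.cosh(x) ** 2

# Riemann sums for the two tail integrals over (0, 10) and (-10, 0)
dx = 1e-3
grid = [i * dx for i in range(1, int(10 / dx))]
pos = sum(g_prime(x) * (1.0 - Phi(x)) * dx for x in grid)   # g'(x) P(X > x)
neg = sum(g_prime(-x) * Phi(-x) * dx for x in grid)         # g'(x) P(X <= x), x < 0
rhs = g(0.0) + pos - neg

# Monte Carlo estimate of E[g(X)]
random.seed(1)
n = 100_000
lhs = sum(g(random.gauss(0.0, 1.0)) for _ in range(n)) / n

print(lhs, rhs)  # both should be close to 0
```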