Proving $E\left[G(X)\right]=\int_0^\infty P(X\ge t)\,dG(t)$ where $X\ge 0$ and $G(\ge 0)$ is an increasing, right-continuous function


Suppose $G$ is an increasing, right-continuous function such that $G(x)\ge 0$ for all $x\ge 0$ and $G(0)=0$. If $X$ is an arbitrary non-negative random variable, then I am trying to show that

$$E\left[G(X)\right]=\int_0^\infty P(X\ge t)\,dG(t) \tag{1}$$

Since $G(X)\ge 0$ a.s., the layer-cake formula (as in this popular question) should give

$$E\left[G(X)\right]=\int_0^\infty P(G(X)>t)\,dt \tag{2}$$

But does $(2)$ reduce to $(1)$ from a change of variables?
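As a numerical sanity check that $(1)$ and $(2)$ agree, here is a short script. The specific choices $X\sim\mathrm{Exp}(1)$ and $G(t)=t^2$ are illustrative assumptions, not from the question; with them, $P(X\ge t)=e^{-t}$, $dG(t)=2t\,dt$, and both sides should land near $E[X^2]=2$.

```python
import math

# Illustrative (assumed) example: X ~ Exponential(1), G(t) = t^2.
# Then P(X >= t) = exp(-t), dG(t) = 2t dt, and E[G(X)] = E[X^2] = 2.

def trapezoid(f, a, b, n=200_000):
    """Simple trapezoidal rule on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    total += sum(f(a + i * h) for i in range(1, n))
    return total * h

# Right-hand side of (1): integral of P(X >= t) dG(t) = exp(-t) * 2t dt.
rhs1 = trapezoid(lambda t: math.exp(-t) * 2 * t, 0.0, 50.0)

# Right-hand side of (2): integral of P(G(X) > t) dt = P(X > sqrt(t)) dt.
rhs2 = trapezoid(lambda t: math.exp(-math.sqrt(t)), 0.0, 2000.0)

print(rhs1, rhs2)  # both close to E[X^2] = 2
```

The truncation points ($50$ and $2000$) are chosen so the neglected tails are far below the quadrature error.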

Starting from scratch, if $F$ is the distribution function of $X$, then I have $$E\left[G(X)\right]=\int_0^\infty G(x)\,dF(x)=\int_0^\infty \left\{\int_0^{G(x)}\,dy\right\}dF(x)$$

Here I thought of using Fubini's theorem but I am not sure how to proceed with that.

Does $G^{-1}$ exist? If $G^{-}$ is some sort of generalized inverse of $G$, then I could perhaps say $$0< y< G(x),0<x<\infty\implies G^{-}(y)< x<\infty,0<y<G(\infty)$$

Using integration by parts I think it is true that $$\int_0^\infty G(x)\,dF(x)=\int_0^\infty P(X\ge t)\,dG(t)$$
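One way to make the integration-by-parts route rigorous is the following sketch. It assumes $E[G(X)]<\infty$ and uses the Lebesgue–Stieltjes integration-by-parts formula for two right-continuous increasing functions, with $F(x^-)$ denoting the left limit of $F$ (so $1-F(x^-)=P(X\ge x)$):

```latex
% Integration by parts on (0,b], using G(0) = 0:
\begin{align*}
\int_{(0,b]} G(x)\,dF(x)
  &= G(b)F(b) - G(0)F(0) - \int_{(0,b]} F(x^-)\,dG(x) \\
  &= G(b)\bigl(F(b)-1\bigr) + \int_{(0,b]} \bigl(1 - F(x^-)\bigr)\,dG(x) \\
  &= -\,G(b)\,P(X > b) + \int_{(0,b]} P(X \ge x)\,dG(x).
\end{align*}
% Let b -> infinity; the integral converges to the right side of (1)
% by monotone convergence, and the boundary term vanishes (see below).
```

The boundary term vanishes because $G$ is increasing, so $G(b)\,P(X>b)\le E\bigl[G(X)\mathbf{1}_{\{X>b\}}\bigr]\to 0$ when $E[G(X)]<\infty$; this is exactly where the monotonicity and non-negativity of $G$ come into play.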

But then I am not sure how the conditions on $G$ come into play. Any hint would be great.

1 Answer

This is a simple application of the Fubini–Tonelli theorem:
$$\int_0^{\infty} P(X \geq t)\,dG(t)=\int_0^{\infty}\int I_{\{t \leq X\}}\,dP\,dG(t)= \int \int_0^{\infty} I_{\{t \leq X\}}\,dG(t)\,dP=\int G(X)\,dP=E\left[G(X)\right].$$
Tonelli applies because the integrand is non-negative. The inner integral equals $\int_{(0,X]}dG(t)=G(X)-G(0)=G(X)$, using $G(0)=0$; right-continuity of the increasing function $G$ is what makes $dG$ a well-defined Lebesgue–Stieltjes measure in the first place.