How to prove $\mathbb{E} \left[ \log(1+X) \right] = \int_{0}^{\infty}\frac{1-F_X(x)}{1+x}\,dx$.


I would like to know how I can prove $\mathbb{E} \left[ \log(1+X) \right] = \int_{0}^{\infty}\frac{1-F_X(x)}{1+x}\,dx$. Is there a paper I could refer to?

2 Answers

Best Answer

Since providing a more general result needs little extra effort, let us do so.

Let $g$ be a continuously differentiable function on $[0,\infty)$ with $g(0) = 0$ and $g' \geq 0$. Suppose that $X$ is a random variable with $\mathbb{P}(X \geq 0) = 1$. Then

\begin{align*} \mathbb{E}[g(X)] &= \mathbb{E}\left[ \int_{0}^{X} g'(x) \, dx \right] \\ &= \mathbb{E}\left[ \int_{0}^{\infty} g'(x) \mathbf{1}_{\{x < X\}} \, dx \right] \\ &= \int_{0}^{\infty} g'(x) \mathbb{E}\left[ \mathbf{1}_{\{x < X\}} \right] \, dx \\ &= \int_{0}^{\infty} g'(x) \left(1 - F_X(x)\right) \, dx, \end{align*}

where the interchange of expectation and integral is justified by Tonelli's theorem (the integrand is non-negative since $g' \geq 0$), and in the last step we used $\mathbb{E}\left[ \mathbf{1}_{\{x < X\}} \right] = \mathbb{P}(x < X) = 1 - F_X(x)$. Your question corresponds to the case $g(x) = \log(1+x)$, which indeed satisfies $g(0) = 0$.
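As a sanity check, the identity can be verified numerically for a concrete distribution. The sketch below (my own illustration, not part of the original answer) takes $X \sim \mathrm{Exp}(1)$, so $1 - F_X(x) = e^{-x}$, and compares both sides of the identity using a simple Simpson's-rule integrator over a truncated domain $[0, 50]$ (the neglected tail is of order $e^{-50}$):

```python
import math

def integrate(f, a, b, n=100_000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# X ~ Exp(1): density f_X(x) = exp(-x), survival 1 - F_X(x) = exp(-x).
# Left side:  E[log(1+X)] = ∫ log(1+x) f_X(x) dx
lhs = integrate(lambda x: math.log(1 + x) * math.exp(-x), 0, 50)
# Right side: ∫ (1 - F_X(x)) / (1+x) dx
rhs = integrate(lambda x: math.exp(-x) / (1 + x), 0, 50)

print(lhs, rhs)  # both ≈ 0.596347 (the known value e·E₁(1))
```

Both integrals agree to high precision, matching the closed form $e \cdot E_1(1)$ for this distribution.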


Remark. This proof works for any non-negative random variable and any such $g$, including the case $\mathbb{E}[g(X)] = \infty$. If, in addition, $X$ is a continuous random variable, one can simply apply integration by parts:

\begin{align*} \mathbb{E}[g(X)] &= \int_{0}^{\infty} g(x) F_X'(x) \, dx \\ &= \left[ -g(x)(1-F_X(x)) \right]_{x=0}^{x=\infty} + \int_{0}^{\infty} g'(x) (1 - F_X(x)) \, dx \end{align*}

with an appropriate assumption on $g$ so that $g(x)(1 - F_X(x)) \to 0$ as $x \to \infty$; the boundary term at $x = 0$ vanishes since $g(0) = 0$.
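To illustrate the boundary condition in a concrete case (my own example), take $X \sim \mathrm{Exp}(\lambda)$ and $g(x) = \log(1+x)$. Then $1 - F_X(x) = e^{-\lambda x}$, and

$$g(x)\left(1 - F_X(x)\right) = e^{-\lambda x} \log(1+x) \to 0 \quad \text{as } x \to \infty,$$

since the exponential decay of the survival function dominates the logarithmic growth of $g$. More generally, the boundary term vanishes whenever the tail of $X$ decays faster than $1/\log(1+x)$.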

Answer

Assuming $X$ has a density $F_X'$ and that $(1-F_X(x))\ln(1+x) \to 0$ as $x \to \infty$, the difference between the two sides is $$\int_0^\infty \left( F_X'(x)\ln(1+x)+\frac{F_X(x)-1}{1+x} \right) dx=\Big[(F_X(x)-1)\ln(1+x)\Big]^\infty_0=0,$$ since the integrand is exactly the derivative of $(F_X(x)-1)\ln(1+x)$.
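This difference can also be checked numerically. The sketch below (my own illustration) takes $X \sim \mathrm{Uniform}(0,1)$, where $F_X'(x) = 1$ and $F_X(x) = x$ on $[0,1]$ and the integrand vanishes for $x > 1$, and integrates the antiderivative's derivative with Simpson's rule:

```python
import math

def integrate(f, a, b, n=100_000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# X ~ Uniform(0,1): F_X'(x) = 1 and F_X(x) = x on [0, 1];
# the integrand is identically zero for x > 1, so [0, 1] suffices.
diff = integrate(lambda x: math.log(1 + x) + (x - 1) / (1 + x), 0, 1)

print(diff)  # ≈ 0
```

Here the exact computation confirms it: $\int_0^1 \ln(1+x)\,dx = 2\ln 2 - 1$ and $\int_0^1 \frac{x-1}{1+x}\,dx = 1 - 2\ln 2$, which sum to zero.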