Convergence in probability implies convergence in expectation.


I would like to see a full solution to the following problem. I have tried approaching it in a number of different ways, but I don't seem to reach the desired result. There is no point in posting my attempts, as they all lead to dead ends.

Show that if $X_n\to 0$ in probability then: $$ E\left[\frac{|X_n|}{1+|X_n|}\right]\to 0, \mbox{ as } n\to\infty $$

The converse implication holds as well, but that one is easy to prove using Chebyshev's Inequality.
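For completeness, here is one way the converse direction can be sketched (the original post only asserts it is easy): since $x\mapsto\frac{x}{1+x}$ is increasing on $[0,\infty)$, for any $\epsilon>0$ the events $\{|X_n|>\epsilon\}$ and $\left\{\frac{|X_n|}{1+|X_n|}>\frac{\epsilon}{1+\epsilon}\right\}$ coincide, so by Markov's inequality
$$ P\{|X_n|>\epsilon\} = P\left\{\frac{|X_n|}{1+|X_n|}>\frac{\epsilon}{1+\epsilon}\right\} \le \frac{1+\epsilon}{\epsilon}\,E\left[\frac{|X_n|}{1+|X_n|}\right] \to 0. $$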

On BEST ANSWER

Use the Dominated Convergence Theorem: $\frac{|X_n|}{1+|X_n|}\leq 1$. To apply the usual (a.s.) version of the DCT you have to pass to a.s. convergent subsequences, but the DCT is also valid under convergence in probability.

Alternatively, let $\epsilon\in(0,1)$ and set $\delta=\frac{\epsilon}{1-\epsilon}$, so that $\frac{\delta}{1+\delta}=\epsilon$. Since $x\mapsto\frac{x}{1+x}$ is increasing and bounded by $1$,
$$ E\left[\frac{|X_n|}{1+|X_n|}\right] = E\left[\frac{|X_n|}{1+|X_n|}I_{\{|X_n|>\delta\}}\right] + E\left[\frac{|X_n|}{1+|X_n|}I_{\{|X_n|\le\delta\}}\right] \le P\{|X_n|>\delta\} + \frac{\delta}{1+\delta} = P\{|X_n|>\delta\} + \epsilon. $$
Since $X_n\to 0$ in probability, the first term tends to $0$, so $\limsup_n E\left[\frac{|X_n|}{1+|X_n|}\right]\le\epsilon$ for every $\epsilon\in(0,1)$, and the result follows.
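To see the claim numerically, here is a small Monte Carlo sketch. The choice $X_n\sim N(0,1/n)$ is my own illustrative example (not from the post); any sequence converging to $0$ in probability would do.

```python
import numpy as np

rng = np.random.default_rng(0)

def bounded_mean(n, samples=100_000):
    """Monte Carlo estimate of E[|X_n| / (1 + |X_n|)] for X_n ~ N(0, 1/n).

    The choice of X_n is an illustrative assumption: its variance 1/n
    shrinks, so X_n -> 0 in probability as n grows.
    """
    x = rng.normal(0.0, 1.0 / np.sqrt(n), size=samples)
    a = np.abs(x)
    return np.mean(a / (1.0 + a))

# The bounded expectation should shrink toward 0 as n increases.
estimates = [bounded_mean(n) for n in (1, 10, 100, 1000)]
print(estimates)
```

Running this shows the estimates decreasing toward $0$, matching the theorem; the bound $\frac{|X_n|}{1+|X_n|}\le 1$ is what keeps the Monte Carlo average well behaved regardless of the tails of $X_n$.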