That $X_n$ converges in probability to $X$ does not necessarily imply convergence in mean, but is it true that $X_n$ converges in probability to $X$ iff $E\left(\dfrac{|X_n-X|}{1+|X_n-X|}\right)\to 0$?
This would be a nice criterion for checking convergence in probability, so I am interested in it. Any suggestions would be appreciated. Thanks in advance.
The function $\psi(t)=t/(1+t)$ is increasing on $[0,\infty)$, so by Chebyshev's inequality, $$P(|X_n-X|>\epsilon)\le \frac {E\psi(|X_n-X|)}{\psi(\epsilon)}.$$
Suppose $E\psi(|X_n-X|)\to0$. For fixed $\epsilon>0$, $$\limsup_{n\to\infty} P(|X_n-X|>\epsilon)\le \frac {\lim_{n\to\infty}E\psi(|X_n-X|)}{\psi(\epsilon)} = 0.$$ So $E\psi(|X_n-X|)\to0$ implies $P(|X_n-X|>\epsilon)\to0$; that is, it implies $X_n\to X$ in probability.
This proves the "if" part of "converges in probability if and only if $E[\psi(|X_n-X|)]\to0$". Here is an argument for the converse. Observe that $\psi(t)\le\min(1,t)$ for $t\ge0$. Suppose $X_n$ converges in probability to $X$. Then for each positive $\epsilon$, we have $P[|X_n-X|>\epsilon]<\epsilon$ for all $n$ sufficiently large. For such $n$: on the event $[|X_n-X|\le\epsilon]$ we have $\psi(|X_n-X|)\le\psi(\epsilon)\le\epsilon$, while on the complementary event, which has probability at most $\epsilon$, we have $\psi(|X_n-X|)\le1$. Putting these two together, we get $$E[\psi(|X_n-X|)]\le \epsilon\, P[|X_n-X|\le\epsilon]+P[|X_n-X|>\epsilon]\le2\epsilon.$$ Since $\epsilon>0$ was arbitrary, this implies $\lim_{n\to\infty} E[\psi(|X_n-X|)] = 0$.
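To see the criterion in action, here is a small Monte Carlo sketch (a hypothetical example of my own, not from the argument above): take $X_n = X + Z/n$ with $Z$ standard normal, so $X_n\to X$ in probability, and estimate $E[\psi(|X_n-X|)]$ for growing $n$ to watch it shrink toward $0$.

```python
import random

def psi(t):
    # psi(t) = t / (1 + t): increasing on [0, inf) and bounded by min(1, t)
    return t / (1 + t)

def estimate_E_psi(n, trials=100_000, seed=0):
    """Monte Carlo estimate of E[psi(|X_n - X|)] when |X_n - X| = |Z|/n,
    Z standard normal (an illustrative choice, not forced by the problem)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        diff = abs(rng.gauss(0.0, 1.0)) / n   # |X_n - X|
        total += psi(diff)
    return total / trials

for n in (1, 10, 100):
    print(n, estimate_E_psi(n))
```

The estimates decrease toward $0$ as $n$ grows, consistent with the equivalence proved above; the same code with any other sequence $X_n\to X$ in probability should behave the same way, since $\psi$ is bounded.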