Two sequences of random variables converge in distribution and their difference is decreasing in $L^1$. What can we say about them?


I am well aware that convergence in distribution to a constant implies convergence in probability. However, is there any way to improve on this result? In particular, I am curious whether, for nice enough random variables, the following holds.
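
For reference, the standard argument I have in mind is just a sketch from the definitions (here $F_n$ denotes the CDF of $x_n$ and $c$ the constant limit, notation introduced only for this remark): for any $\epsilon > 0$,

$$
P(|x_n - c| > \epsilon) \;\le\; F_n(c - \epsilon) + \bigl(1 - F_n(c + \epsilon)\bigr) \;\to\; 0,
$$

since $c - \epsilon$ and $c + \epsilon$ are continuity points of the limiting CDF $t \mapsto \mathbf{1}\{t \ge c\}$.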

Suppose that the sequence $E[|v_n - x_n|]$ is decreasing in $n$ (i.e., $E[|v_n - x_n|] \ge E[|v_{n+1} - x_{n+1}|]$ for all $n$), and suppose that $v_n \to f$ and $x_n \to f$ in distribution. Can we show that $v_n$ and $x_n$ converge to the same limit (assuming the limit exists) in probability, in $L^2$, or almost surely?
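
One standard observation that may help frame this (just Markov's inequality, nothing specific to my setting): for any $\epsilon > 0$,

$$
P\bigl(|v_n - x_n| > \epsilon\bigr) \;\le\; \frac{E[|v_n - x_n|]}{\epsilon},
$$

so if the decreasing sequence $E[|v_n - x_n|]$ actually tends to $0$, then $v_n - x_n \to 0$ in probability immediately. The case I cannot resolve is when it is merely decreasing, with a possibly positive limit.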