I have two sequences of random variables $\{ X_n\}$ and $\{Y_n \}$. I know that $X_n \to^d D, Y_n \to^d D$. Can I conclude that $X_n - Y_n \to^p 0$?
If I cannot, what other conditions do I need for the conclusion to hold? Thanks.
Convergence in distribution is called weak convergence because it only constrains the marginal distribution of each variable, i.e., the frequencies with which values occur. It says nothing about how one variable is related to another. For example, let $X$ be uniform on $[0,1]$ and $Y = 1-X$. Then $X$ and $Y$ are identical in distribution, yet $X - Y = 2X - 1$ is uniform on $[-1,1]$, so $X$ is nowhere near $Y$ with substantial probability.
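A quick simulation (a sketch using numpy; the variable names are mine) makes this concrete: the marginals of $X$ and $Y = 1-X$ match, but $|X - Y|$ exceeds $1/2$ about half the time, since $P(|2X-1| > 1/2) = P(X < 1/4) + P(X > 3/4) = 1/2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# X uniform on [0,1]; Y = 1 - X has the same distribution,
# but X - Y = 2X - 1 is uniform on [-1, 1], not concentrated at 0.
n = 100_000
x = rng.uniform(0.0, 1.0, size=n)
y = 1.0 - x

# Identical marginal distributions: means and variances agree.
print(x.mean(), y.mean())  # both close to 1/2
print(x.var(), y.var())    # both close to 1/12

# ...yet the difference is far from 0 with probability about 1/2.
d = x - y
print(np.mean(np.abs(d) > 0.5))  # close to 0.5
```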
No, you cannot. Take $Z_n$ i.i.d. and non-degenerate; then $Z_n \Rightarrow Z_{\infty} =_d Z_1$. Set $X_{n} = Z_{2n}$ and $Y_n = Z_{2n+1}$. Both sequences converge in distribution to the same limit, but $X_n - Y_n$ is the difference of two independent copies of $Z_1$, which has the same non-degenerate law for every $n$, so $X_n - Y_n$ does not converge in probability to $0$. A separate issue is that convergence in probability requires all the random variables to be defined on the same probability space, which convergence in distribution alone does not guarantee. But even when they live on the same space, the counter-example above shows you still cannot conclude. As for extra conditions: one simple sufficient condition is that $D$ is degenerate, i.e., a point mass at some constant $c$. Convergence in distribution to a constant implies convergence in probability, so then $X_n \to^p c$ and $Y_n \to^p c$, hence $X_n - Y_n \to^p 0$.
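The interleaving counter-example can also be checked numerically. Below is a sketch (my own variable names, with $Z_1 \sim N(0,1)$ chosen for concreteness): $X_n$ and $Y_n$ are independent standard normals for every $n$, so $X_n - Y_n \sim N(0,2)$ regardless of $n$, and $P(|X_n - Y_n| > 1)$ stays at a fixed positive constant instead of shrinking to $0$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Z_n i.i.d. N(0,1); X_n = Z_{2n}, Y_n = Z_{2n+1}.
# Each X_n and Y_n is standard normal, so both converge in distribution
# to N(0,1), but X_n - Y_n ~ N(0, 2) for every n: the probability
# P(|X_n - Y_n| > 1) is a fixed positive constant, independent of n.
trials = 200_000
for n in (10, 1_000, 100_000):  # n is irrelevant: the law never changes
    xn = rng.standard_normal(trials)  # copies of Z_{2n}
    yn = rng.standard_normal(trials)  # copies of Z_{2n+1}
    print(n, np.mean(np.abs(xn - yn) > 1.0))  # stays near P(|N(0,2)| > 1)
```

Each printed probability hovers around $P(|N(0,2)| > 1) = 2(1 - \Phi(1/\sqrt{2})) \approx 0.48$, no matter how large $n$ gets, which is exactly the failure of $X_n - Y_n \to^p 0$.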