Almost sure convergence + convergence in distribution implies joint convergence in distribution?


I'm wondering, if I have two sequences of random variables $(X_n)$ and $(Y_n)$, defined on the same probability space, such that $X_n\stackrel{a.s.}{\rightarrow}X$ and $Y_n\stackrel{d}{\rightarrow}Y$, is it possible to conclude that they converge jointly in distribution, i.e. $$ (X_n,Y_n)\stackrel{d}{\rightarrow}(X,Y) $$ as $n\to\infty$?

I believe that this question is closely related to the following: if $Y_n\stackrel{d}{\rightarrow}Y$, is it true that $$ (X,Y_n)\stackrel{d}{\rightarrow}(X,Y) $$ as $n\to \infty$?

Thank you in advance for any thoughts, comments etc.!


Best answer:

For your second question, take $Y_n = X$ for every $n$, and let $Y$ be independent of $X$ with the same distribution. Then each $Y_n$ has the same law as $Y$, so trivially $Y_n\stackrel{d}{\rightarrow}Y$; but $(X, Y_n) = (X, X)$ is concentrated on the diagonal, whereas $(X, Y)$ is not (unless $X$ is degenerate), so $(X, Y_n)$ does not converge in distribution to $(X, Y)$. Since $X_n := X$ converges almost surely to $X$, the same example shows that the answer to your first question is also no.
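A quick Monte Carlo sketch of this counterexample (the use of NumPy and the empirical-correlation check are my own additions, not from the thread): the pair $(X, Y_n) = (X, X)$ has correlation $1$, while the would-be limit $(X, Y)$ has correlation $0$, so the joint laws cannot agree.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 100_000

# X ~ N(0,1); take Y_n = X for every n, and Y an independent copy of X.
X = rng.standard_normal(m)
Y = rng.standard_normal(m)  # independent of X, same distribution

# Each Y_n has the same marginal law as Y, so Y_n -> Y in distribution.
# But (X, Y_n) = (X, X) sits on the diagonal, while (X, Y) does not:
corr_pair_n = np.corrcoef(X, X)[0, 1]   # correlation within (X, Y_n)
corr_limit  = np.corrcoef(X, Y)[0, 1]   # correlation within (X, Y)

print(corr_pair_n)       # 1.0: perfectly dependent
print(abs(corr_limit))   # near 0: independent
```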

Second answer:

The implication does hold in the special case where the almost-sure limit $X$ is a constant $c$; this is essentially Slutsky's theorem. Convergence in distribution is the weakest of the standard modes of convergence, so $X_{n}\rightarrow_{a.s.} c$ implies $X_{n}\rightarrow_{p} c$ and $X_{n}\rightarrow_{d} c$.

First, $Y_{n}\rightarrow_{d} Y$ implies $(Y_{n},c)\rightarrow_{d}(Y,c)$: for any bounded continuous function $f$, the map $y\mapsto f(y,c)$ is again bounded and continuous, so $E(f(Y_{n},c))\rightarrow E(f(Y,c))$.

Next, $\|(Y_{n},X_{n})-(Y_{n},c)\| = |X_{n}-c|\rightarrow_{p} 0$, and two sequences whose difference vanishes in probability have the same limit in distribution. Combining the two steps, you obtain $(Y_{n},X_{n})\rightarrow_{d}(Y,c)$, which is the desired result with $X=c$.
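A numerical sketch of the constant-limit case (the concrete sequences $X_n = c + Z_n/n$, $Y_n = Y + 1/n$, and the test function $f(y,x)=\cos(y+x)$ are illustrative choices of mine, not from the thread): we check that $E[f(Y_n, X_n)]$ approaches $E[f(Y, c)]$ for a bounded continuous $f$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, c = 200_000, 2.0

Y = rng.standard_normal(m)  # the distributional limit of Y_n

def E_f(n):
    # Monte Carlo estimate of E[f(Y_n, X_n)] for f(y, x) = cos(y + x),
    # a bounded continuous test function.
    Y_n = Y + 1.0 / n                     # Y_n -> Y in distribution
    X_n = c + rng.standard_normal(m) / n  # X_n -> c in probability
    return np.cos(Y_n + X_n).mean()

limit = np.cos(Y + c).mean()  # Monte Carlo estimate of E[f(Y, c)]
print(abs(E_f(1000) - limit))  # small for large n
```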