Marginal convergence with independence implies joint convergence


If $X_n$ and $Y_n$ are independent random vectors for every $n$, then $X_n \overset{d}{\to} X$ and $Y_n \overset{d}{\to}Y$ imply that $(X_n,Y_n) \overset{d}{\to} (X,Y)$ where $X$ and $Y$ are independent.

I know the statement is true when $X_n, Y_n$ converge to $X, Y$ in probability, without the assumption of independence. When I tried to prove this, I used the characteristic function, but I got stuck showing that $X$ and $Y$ are independent.


Answer (Will M.):

Hint: $Ee^{i\langle (u,v),\,(X_n, Y_n)\rangle} = Ee^{i (u\cdot X_n + v\cdot Y_n)}=Ee^{iu\cdot X_n}\, Ee^{i v\cdot Y_n}$, now let $n \to \infty.$
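Spelling out the hint (a sketch, assuming as in the second answer that $X$ and $Y$ are taken independent):
$$
E e^{i\langle (u,v),\,(X_n,Y_n)\rangle}
= E e^{iu\cdot X_n}\, E e^{iv\cdot Y_n}
\;\xrightarrow[n\to\infty]{}\;
E e^{iu\cdot X}\, E e^{iv\cdot Y}
= E e^{i\langle (u,v),\,(X,Y)\rangle}.
$$
The first equality is independence of $X_n$ and $Y_n$; the convergence holds because $X_n \overset{d}{\to} X$ and $Y_n \overset{d}{\to} Y$ imply pointwise convergence of the marginal characteristic functions (as $x \mapsto e^{iu\cdot x}$ is bounded and continuous); the last equality is the assumed independence of $X$ and $Y$. Since the characteristic functions of $(X_n,Y_n)$ converge pointwise to the characteristic function of $(X,Y)$, the Lévy continuity theorem gives $(X_n,Y_n) \overset{d}{\to} (X,Y)$.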

Answer:

Counterexample to what you are trying to prove: Let $X_1,Y_1,X_2,Y_2,\cdots$ be i.i.d. random variables with standard normal distribution. Then $X_n \to X_1$ in distribution and $Y_n \to X_1$ in distribution. Even though $X_n$ and $Y_n$ are independent for each $n$, it is not true that $X_1$ is independent of itself!
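A quick Monte Carlo check makes the counterexample concrete (a hypothetical simulation, not part of the original answer): for an independent pair $(X_n, Y_n)$ we have $P(X_n < 0, Y_n < 0) = 1/4$, while the degenerate pair $(X_1, X_1)$ has $P(X_1 < 0, X_1 < 0) = 1/2$, so $(X_n, Y_n)$ cannot converge in distribution to $(X_1, X_1)$.

```python
# Simulation of the counterexample: X_1, Y_1, X_2, Y_2, ... i.i.d. N(0,1).
import random

random.seed(0)
N = 200_000

xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [random.gauss(0.0, 1.0) for _ in range(N)]

# Joint probability for an independent pair (X_n, Y_n): close to 1/4.
p_pair = sum(1 for x, y in zip(xs, ys) if x < 0 and y < 0) / N

# Joint probability for the degenerate pair (X_1, X_1): close to 1/2.
p_self = sum(1 for x in xs if x < 0) / N

print(p_pair, p_self)
```

The two estimates differ by roughly $1/4$, confirming that the candidate limits have genuinely different joint distributions even though all marginals are standard normal.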

The hypothesis that $X_n \overset {d} {\to} X$ does not determine the random variable $X$: the statement remains true if you replace $X$ by any other random variable with the same distribution. So there is no question of proving that $X$ and $Y$ are independent (since we don't even know which random variables they are). You are supposed to assume that $X$ and $Y$ are independent (so that their joint distribution is uniquely determined) and then prove that $(X_n,Y_n)$ converges to $(X,Y)$ in distribution. Now use the answer by Will M.