Does component-wise convergence in distribution imply the random vector converges in distribution?


Let $X_1, X_2, \dots$ be a sequence of random variables such that $X_n \xrightarrow{d} X$, and let $Y_1, Y_2, \dots$ be a sequence of random variables such that $Y_n \xrightarrow{d} Y$. Is it true that $\left(X_n, Y_n\right) \xrightarrow{d} \left(X,Y\right)$? And is the converse true?



No. Let $X$ have the standard normal distribution and set $X_n = X$ and $Y_n = -X$ for all $n$. By symmetry $-X$ has the same distribution as $X$, so $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{d} X$. Yet $(X_n, Y_n)$ does not converge in distribution to $(X, X)$: since $(x,y) \mapsto x+y$ is continuous, joint convergence would force $X_n + Y_n \xrightarrow{d} X + X = 2X$, but $X_n + Y_n = 0$ for every $n$ while $2X$ is non-degenerate. The converse is true, and it follows from the fact that $(x,y) \mapsto x$ and $(x,y) \mapsto y$ are continuous maps.
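As a quick empirical sanity check of this counterexample, here is a short NumPy sketch (the seed and sample size are arbitrary choices, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # samples of X ~ N(0, 1)

# Counterexample: X_n = X and Y_n = -X for every n.
# Marginally, -X has the same N(0, 1) law as X, so Y_n -> X in distribution.
xn, yn = x, -x

# The sum X_n + Y_n is identically 0, so its variance is exactly 0 ...
print(np.var(xn + yn))  # 0.0

# ... while X + X = 2X has variance 4, so (X_n, Y_n) cannot converge
# in distribution to (X, X).
print(np.var(2 * x))
```

The sample variance of `2 * x` comes out close to 4, confirming that the limiting sum is non-degenerate even though every $X_n + Y_n$ is.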


Not in general.

It might be, for example, that $X_n$ and $Y_n$ are not defined on the same probability space, in which case the pair $(X_n,Y_n)$ is not even well defined. The same applies to $X$ and $Y$.

Be aware that here we are dealing with convergence of distributions, not of the random variables themselves, which merely represent those distributions (and may even be absent). On that point there is an essential difference with convergence in probability and with almost sure convergence.

And even if that issue is avoided by assumption, the answer remains "no" (see Kavi's answer).

Conversely, if $(X_n,Y_n)$ and $(X,Y)$ are random vectors with $(X_n,Y_n)\stackrel{d}{\to}(X,Y)$, then indeed $X_n\stackrel{d}{\to}X$ and $Y_n\stackrel{d}{\to}Y$.
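The converse is a one-line application of the continuous mapping theorem. Writing $\pi_1(x,y)=x$ for the first coordinate projection (notation introduced here for illustration):

```latex
% For every bounded continuous f : R -> R, the composition f o pi_1 is
% bounded and continuous on R^2, so joint convergence in distribution gives
\[
  \mathbb{E}\,f(X_n)
  \;=\; \mathbb{E}\,(f \circ \pi_1)(X_n, Y_n)
  \;\longrightarrow\; \mathbb{E}\,(f \circ \pi_1)(X, Y)
  \;=\; \mathbb{E}\,f(X),
\]
% which is exactly X_n -> X in distribution; the same argument with the
% second projection pi_2(x,y) = y gives Y_n -> Y in distribution.
```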