Looking for a counterexample in convergence of random variables


I was wondering: is it possible that a sequence of $k$-dimensional random vectors $\{\mathbf{X_n}\}$ converges componentwise in distribution, but not jointly?

What I mean is $${X_{nj}}\xrightarrow{\text{dist}}{X_j}\;\text{ for } 1\le j\le k,$$ but the joint distribution of $(X_{n1},\dots, X_{nk})$ doesn't converge in distribution. I am not sure whether such an example exists, but it seems plausible that it should: convergence in distribution is a very weak mode of convergence, and it forgets a lot about the random variables themselves. For instance, it doesn't even respect addition of sequences, since $X_n\xrightarrow{\text{dist}}X$ and $Y_n\xrightarrow{\text{dist}}Y$ do not imply $X_n+Y_n\xrightarrow{\text{dist}}X+Y$. So is there a counterexample like this? Thanks for all the help.

1 Answer

Let $X$ be a standard normal random variable, and take $X_{n1}=X$ and $X_{n2}=(-1)^nX$ for every $n$. Then both sequences $(X_{n1})$ and $(X_{n2})$ converge weakly, since each sequence of marginal distributions is constant (every $X_{nj}$ is standard normal, because $-X$ has the same distribution as $X$). But the sequence of pairs $(X_{n1},X_{n2})$ does not converge in distribution: for even $n$ the pair is $(X,X)$, whose law is concentrated on the diagonal $\{(x,x)\}$, while for odd $n$ it is $(X,-X)$, whose law is concentrated on the anti-diagonal $\{(x,-x)\}$. The joint law alternates between these two distinct distributions, so it has no limit.
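A minimal numerical sketch of this construction (the variable names are my own, not from the answer): sampling $X$ once and forming $X_{n2}=(-1)^nX$ shows the marginals look identical for every $n$, while the correlation of the pair flips between $+1$ and $-1$ as $n$ alternates between even and odd.

```python
import numpy as np

# Simulate the counterexample: X ~ N(0,1), X_n1 = X, X_n2 = (-1)^n X.
# Each marginal is N(0,1) for every n, but the joint law alternates
# between the diagonal {y = x} (even n) and the anti-diagonal
# {y = -x} (odd n), so the pair cannot converge in distribution.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)

for n in (2, 3):  # one even and one odd index
    x1 = x
    x2 = ((-1) ** n) * x
    # Marginal summary statistics agree regardless of n...
    print(f"n={n}: mean={x2.mean():+.3f}, std={x2.std():.3f}", end=", ")
    # ...but the joint dependence flips sign with the parity of n.
    print(f"corr(x1, x2)={np.corrcoef(x1, x2)[0, 1]:+.1f}")
```

The printed correlation is $+1.0$ for even $n$ and $-1.0$ for odd $n$, even though the per-coordinate statistics are identical in both cases.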