Independence of random variables is necessary to have convergence in distribution


I've proved the following:

Suppose that, for each $n$, $X_{n}$ and $Y_{n}$ are independent, and that $X$ and $Y$ are independent as well. Then, if $X_{n}\implies X$ and $Y_{n}\implies Y$, we have $(X_{n},Y_{n})\implies (X,Y)$. As a consequence, $X_{n}+Y_{n}\implies X+Y$.

Here $\implies$ represents convergence in distribution.

The proof is quick, using characteristic functions and the independence of the random variables.

I've been trying to construct a counterexample for the case in which the hypothesis of independence is dropped, but without success.

Any help is appreciated in advance.


There are 2 answers below.

Accepted answer

Let $X$ have the standard normal distribution, and set $X_n=X$, $Y=X$, and $Y_n=-X$ for all $n$. Since the standard normal distribution is symmetric, $Y_n$ has the same distribution as $Y$ for each $n$, so $X_n \implies X$ and $Y_n \implies Y$. Now $X_n + Y_n = 0$ for every $n$, whereas $X+Y=2X$ is normal with mean $0$ and variance $4$.
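A quick NumPy simulation (my sketch, not part of the original answer) makes the gap visible: the sum $X_n+Y_n$ is identically zero, while the limit sum $X+Y=2X$ has variance close to $4$.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# X has a standard normal distribution; X_n = X and Y_n = -X for every n.
X = rng.standard_normal(n_samples)
Xn, Yn = X, -X

# Y_n has the same distribution as Y = X (symmetry of the normal),
# so X_n => X and Y_n => Y in distribution. Yet the sums differ:
sum_n = Xn + Yn      # identically 0
limit_sum = X + X    # 2X, normal with mean 0 and variance 4

print(sum_n.var())       # exactly 0.0
print(limit_sum.var())   # close to 4
```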

Another answer

Let $X_n = \begin{cases} 1 & \text{with probability } \frac12 \\ -1 & \text{with probability } \frac12\end{cases}$

and $Y_n = -X_n$; then $X_n + Y_n$ is zero (deterministic). But if $X$ and $Y$ are independent copies of this distribution, $X+Y$ takes the values $-2, 0, 2$ with probabilities $\frac14, \frac12, \frac14$, so it is not deterministic.
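The same point can be checked numerically (a sketch of my own, assuming $X$ and $Y$ are independent copies of the $\pm 1$ distribution):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# X_n takes the values +1 and -1 with probability 1/2 each.
Xn = rng.choice([-1, 1], size=n)
Yn = -Xn  # same distribution as X_n, by symmetry

# Independent copies X and Y of the limiting distribution:
X = rng.choice([-1, 1], size=n)
Y = rng.choice([-1, 1], size=n)

# X_n + Y_n is identically zero, while X + Y takes the
# values -2, 0, 2 with probabilities 1/4, 1/2, 1/4.
print(set((Xn + Yn).tolist()))        # {0}
print(sorted(set((X + Y).tolist())))  # [-2, 0, 2]
```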