I've proved the following:
Suppose that $X_{n}$ and $Y_{n}$ are independent random variables, and that $X$ and $Y$ are independent as well. Then, if $X_{n}\implies X$ and $Y_{n}\implies Y,$ we have that $(X_{n},Y_{n})\implies (X,Y).$ As a consequence, $X_{n}+Y_{n}\implies X+Y.$
Here $\implies$ represents convergence in distribution.
The proof is quick thanks to characteristic functions and the independence of the random variables.
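For concreteness, here is a sketch of how such a characteristic-function argument typically goes (this is my reconstruction, not necessarily the exact proof referred to above). Using independence within each pair,

$$\varphi_{(X_n,Y_n)}(s,t)=\mathbb{E}\,e^{i(sX_n+tY_n)}=\varphi_{X_n}(s)\,\varphi_{Y_n}(t)\longrightarrow\varphi_{X}(s)\,\varphi_{Y}(t)=\varphi_{(X,Y)}(s,t),$$

so $(X_n,Y_n)\implies (X,Y)$ by the multivariate Lévy continuity theorem, and $X_n+Y_n\implies X+Y$ then follows from the continuous mapping theorem applied to $(x,y)\mapsto x+y$.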
I've been trying to find a counterexample in which the independence hypothesis is dropped, but without success so far.
Any help is appreciated in advance.
Let $X$ have the standard normal distribution, $X_n=X$ for all $n$, $Y=X$, and $Y_n=-X$ for all $n$. Since the standard normal distribution is symmetric, $Y_n$ has the same distribution as $Y$ for each $n$, so $X_n\implies X$ and $Y_n\implies Y$. However, $X_n+Y_n=0$ for all $n$, whereas $X+Y=2X$ is normal with mean $0$ and variance $4$, so $X_n+Y_n$ does not converge in distribution to $X+Y$.
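The counterexample is easy to check empirically. A minimal simulation sketch (variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# X ~ N(0, 1); the counterexample takes X_n = X and Y_n = -X.
# By symmetry of the standard normal, -X has the same distribution as X,
# so marginally Y_n ~ Y = X.
x = rng.standard_normal(n)

sum_n = x + (-x)      # X_n + Y_n: identically 0 for every sample
limit_sum = x + x     # X + Y = 2X ~ N(0, 4)

print(np.max(np.abs(sum_n)))   # 0.0 exactly
print(np.var(limit_sum))       # close to 4
```

The empirical variance of `limit_sum` sits near $4$, while `sum_n` is exactly zero, matching the claim that the marginal limits hold but the sum's limit does not.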