Suppose we have two sequences of random variables $X_n$ and $Y_n$ that converge in distribution to $X\sim\Gamma(\frac{1}{2},1)$ and $Y\sim\Gamma(\frac{1}{2},1)$, respectively. Unfortunately, in general, $X_n$ is not independent of $Y_n$, although experimentally I see that as $n$ becomes larger and larger, $X_n$ and $Y_n$ become more and more independent: for all $B_1,B_2\in\mathcal{B}(\mathbb{R})$ I observe, as $n$ increases, $P(X_n\in B_1,Y_n\in B_2) \approx P(X_n\in B_1)\cdot P(Y_n\in B_2)$. In particular, $X_n = \frac{1}{n}\Bigl(\sum_{i = 1}^{n}\gamma_i\Bigr)^2$ and similarly $Y_n = \frac{1}{n}\Bigl(\sum_{i = 1}^{n}\eta_i\Bigr)^2$, where $\{\gamma_i\}_{i\in\mathbb{N}}$ and $\{\eta_i\}_{i\in\mathbb{N}}$ are each i.i.d. sequences of random variables, but $\gamma_i$ is not independent of $\eta_i$.
Question
What can we say about the convergence in distribution of the sequence $X_n+Y_n$? Again, experimentally I see that $X_n+Y_n\to W\sim\Gamma(1,1)$ (as if they were independent!). Since the dependence between $X_n$ and $Y_n$ somehow becomes weaker and weaker as $n$ increases, I suspect there must exist some theorem that guarantees this convergence, although I was not able to find one in my search.
Do you know of any conditions that $X_n$ and $Y_n$ must satisfy to guarantee the convergence in distribution of their sum? Thank you in advance.
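For concreteness, here is a minimal simulation sketch of this setup. The pair $(\gamma_i, \eta_i)$ below is my own illustrative choice (the question does not specify one): $\gamma_i = g_i$ and $\eta_i = s_i\,|g_i|$, with $g_i \sim N(0, \tfrac{1}{2})$ and $s_i$ an independent random sign, so the two sequences are dependent ($|\eta_i| = |\gamma_i|$) but uncorrelated, and both marginal limits are $\Gamma(\frac{1}{2},1)$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, m = 500, 20_000   # n terms per sum, m Monte Carlo replicates

# Dependent but uncorrelated pairs (illustrative choice, not from the question):
# gamma_i = g_i and eta_i = s_i * |g_i| with g_i ~ N(0, 1/2) and s_i = +/-1,
# so eta_i ~ N(0, 1/2) as well, Cov(gamma_i, eta_i) = 0, yet |eta_i| = |gamma_i|.
g = rng.normal(0.0, np.sqrt(0.5), size=(m, n))
s = rng.choice([-1.0, 1.0], size=(m, n))
X = g.sum(axis=1) ** 2 / n                 # X_n = (sum gamma_i)^2 / n
Y = (s * np.abs(g)).sum(axis=1) ** 2 / n   # Y_n = (sum eta_i)^2 / n

gamma_half = stats.gamma(a=0.5, scale=1.0)   # Gamma(1/2, 1)
exp_one = stats.gamma(a=1.0, scale=1.0)      # Gamma(1, 1) = Exp(1)

print(stats.kstest(X, gamma_half.cdf))       # marginal of X_n
print(stats.kstest(Y, gamma_half.cdf))       # marginal of Y_n
print(stats.kstest(X + Y, exp_one.cdf))      # the sum looks Gamma(1, 1)
```

All three Kolmogorov–Smirnov statistics should come out small, reproducing the observed behavior: the marginals look $\Gamma(\frac{1}{2},1)$ and the sum looks $\Gamma(1,1)$, as if the summands were independent.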
Suppose $X_n \stackrel{d}\to X$ and $Y_n \stackrel{d}\to Y$. Also, consider the following seemingly different formulations of "asymptotic independence" between $(X_n)$ and $(Y_n)$:
1. $\mathbf{P}(X_n \leq x, Y_n \leq y) \to \mathbf{P}(X \leq x)\mathbf{P}(Y \leq y)$ whenever $\mathbf{P}(X=x) = 0$ and $\mathbf{P}(Y = y) = 0$.
2. $\mathbf{P}(X_n \in A, Y_n \in B) \to \mathbf{P}(X \in A)\mathbf{P}(Y \in B)$ for any Borel subsets $A$ and $B$ of $\mathbb{R}$ satisfying $\mathbf{P}(X\in\partial A) = 0$ and $\mathbf{P}(Y \in \partial B) = 0$.
3. $\mathbf{E}[\phi(X_n)\psi(Y_n)] \to \mathbf{E}[\phi(X)]\mathbf{E}[\psi(Y)]$ for any bounded Lipschitz functions $\phi$ and $\psi$ on $\mathbb{R}$ (a numerical spot-check of this condition follows the list).
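Condition 3 is easy to probe numerically. Here is a sketch reusing the dependent-but-uncorrelated construction from the question's simulation above, with the (arbitrarily chosen) bounded Lipschitz test function $\phi = \psi = \tanh$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, m = 500, 20_000

# Same illustrative construction as before: uncorrelated but dependent pairs.
g = rng.normal(0.0, np.sqrt(0.5), size=(m, n))
s = rng.choice([-1.0, 1.0], size=(m, n))
X = g.sum(axis=1) ** 2 / n
Y = (s * np.abs(g)).sum(axis=1) ** 2 / n

phi = np.tanh   # bounded (|tanh| <= 1) and Lipschitz; serves as both phi and psi

lhs = np.mean(phi(X) * phi(Y))                        # E[phi(X_n) psi(Y_n)]
G = stats.gamma(a=0.5, scale=1.0)                     # the Gamma(1/2, 1) limit
rhs = np.mean(phi(G.rvs(m, random_state=rng))) ** 2   # E[phi(X)] E[psi(Y)]
print(lhs, rhs)   # the two should be close for large n
```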
It is not hard to check that these conditions are equivalent. (This is more or less a 2-dimensional version of the Portmanteau theorem.) Moreover, if any of these conditions holds, then setting $\phi(\bullet) = \psi(\bullet) = e^{i\xi \bullet}$ for each $\xi \in \mathbb{R}$ (more precisely, applying condition 3 to the real and imaginary parts) gives
$$ \varphi_{X_n+Y_n}(\xi) = \mathbf{E}[e^{i\xi X_n}e^{i\xi Y_n}] \to \mathbf{E}[e^{i\xi X}]\mathbf{E}[e^{i\xi Y}] = \varphi_{X}(\xi)\varphi_{Y}(\xi). $$
Note that the right-hand side is the characteristic function of the convolution of the distributions of $X$ and $Y$. In other words, by Lévy's continuity theorem, $X_n + Y_n \stackrel{d}{\to} X + Y$, where $X$ and $Y$ are taken to be independent.
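For the Gamma example from the question, the limiting product is explicit: $\varphi_X(\xi) = \varphi_Y(\xi) = (1-i\xi)^{-1/2}$, so $\varphi_X(\xi)\varphi_Y(\xi) = (1-i\xi)^{-1}$, the characteristic function of $\Gamma(1,1)$. A short numerical comparison, again under the same illustrative dependent-but-uncorrelated construction (my assumption, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 500, 20_000

# X_n + Y_n under the illustrative dependent-but-uncorrelated pairs.
g = rng.normal(0.0, np.sqrt(0.5), size=(m, n))
s = rng.choice([-1.0, 1.0], size=(m, n))
S = (g.sum(axis=1) ** 2 + (s * np.abs(g)).sum(axis=1) ** 2) / n

for xi in (0.5, 1.0, 2.0):
    emp = np.mean(np.exp(1j * xi * S))   # empirical characteristic function of X_n + Y_n
    tgt = 1.0 / (1.0 - 1j * xi)          # phi_X(xi) * phi_Y(xi) = (1 - i xi)^(-1), the CF of Gamma(1, 1)
    print(f"xi={xi}: empirical={emp:.4f}, target={tgt:.4f}")
```

Even at moderate $n$ the empirical values should sit close to the target, consistent with $X_n + Y_n \stackrel{d}{\to} \Gamma(1,1)$.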