Let $X_i$ and $Y_i$, $i\in\mathbb{N}$, be random variables. I want to show asymptotic normality: $$\sqrt{n}\bigg(\frac{1}{d_n}\sum_{k=1}^{d_n}(X_k +Y_k)\bigg)\overset{d}{\to} N(0,\sigma^2), \quad n\to\infty.$$
The problem is that, in my case, $X_k$ is an ugly expression, and I'm struggling to determine the form of the variance $\sigma^2$. However, I do know that $\frac{\sqrt{n}}{d_n}\sum_{k=1}^{d_n}X_k\overset{p}{\to} 0$, i.e., this term is $o_p(1)$. In addition, $\frac{\sqrt{n}}{d_n}\sum_{k=1}^{d_n}Y_k\overset{d}{\to} N(0,\sigma_1^2)$, where $\sigma_1^2$ is completely known. Now, Slutsky's theorem says that if $Z_1,Z_2$ are random variables such that $Z_1\overset{d}{\to}Z$ and $Z_2\overset{p}{\to}c$ for a constant $c$, then $Z_1+Z_2\overset{d}{\to}Z+c$. Applying this with $Z_1=\frac{\sqrt{n}}{d_n}\sum_{k=1}^{d_n}Y_k$, $Z_2=\frac{\sqrt{n}}{d_n}\sum_{k=1}^{d_n}X_k$ and $c=0$, I conclude that $$\sqrt{n}\bigg(\frac{1}{d_n}\sum_{k=1}^{d_n}(X_k +Y_k)\bigg)\overset{d}{\to} N(0,\sigma_1^2), \quad n\to\infty.$$
I suspect that there is something wrong with this argument, since I simply ignored the dependence/covariance between $X_k$ and $Y_k$. Can you give me feedback on this? Is there something wrong here?
Thanks in advance!
You wrote that you know that:
$$\sqrt{n}/d_n\sum_{k=1}^{d_n}X_k\overset{d}{\to} N(0,\sigma_1^2),$$
but maybe you wrote $X$ where you meant $Y$. If you did indeed mean $Y$, then you are under the hypotheses of Slutsky's theorem, and you can ignore the correlation.
If not, then of course you don't know (yet) anything about $Y$, and here the hard work begins.
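To see concretely why the correlation can be ignored once the $X$-term is $o_p(1)$, here is a quick simulation sketch (my own illustration, not from the question; it assumes $d_n = n$ and $\sigma_1^2 = 1$ for concreteness). It makes $X_k$ strongly correlated with $Y_k$ but of order $1/n$, so that $\frac{\sqrt{n}}{n}\sum_k X_k \overset{p}{\to} 0$, and checks that the empirical variance of the statistic still matches $\sigma_1^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000      # sample size; take d_n = n for concreteness
reps = 2_000    # Monte Carlo replications
sigma1 = 1.0    # known variance of the Y-term's limit

stats = np.empty(reps)
for r in range(reps):
    # Y-term: by the CLT, sqrt(n) * mean(Y) is approximately N(0, sigma1^2)
    Y = rng.normal(0.0, sigma1, size=n)
    # X_k is perfectly correlated with Y_k (plus noise), but scaled by 1/n,
    # so sqrt(n) * mean(X) -> 0 in probability: the o_p(1) term
    X = (Y + rng.normal(0.0, 1.0, size=n)) / n
    stats[r] = np.sqrt(n) * np.mean(X + Y)

# Despite the dependence between X_k and Y_k, the empirical variance
# of the statistic should be close to sigma1^2 = 1
print(np.var(stats))
```

The point is that Slutsky's theorem makes no assumption about the joint distribution of $Z_1$ and $Z_2$: since $Z_2$ converges in probability to a constant, its dependence on $Z_1$ washes out in the limit.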