Convergence in distribution of a sum of two random sequences


Let $\{(x_i,y_i)\}_{i=1}^\infty$ be a sequence of iid random vectors with $E|x_i|<\infty$, $E(y_i)=0$, and $E(x_iy_i)=0$, and suppose both $y_i$ and $x_iy_i$ have finite second moments. Define

$$ a_n=\frac{1}{\sqrt{n}}\sum_{i=1}^{n}x_iy_i, \quad b_n=\frac{1}{\sqrt{n}}\sum_{i=1}^{n}y_i, \quad\text{and}\quad c_n = \frac1n\sum_{i=1}^{n}x_i. $$

By the CLT, $a_n\overset{d}{\to}N\big(0,E(x_i^2y_i^2)\big)$; and since $b_n\overset{d}{\to}N\big(0,E(y_i^2)\big)$ by the CLT while $c_n\overset{p}{\to}E(x_i)$ by the LLN, Slutsky's theorem gives $b_nc_n\overset{d}{\to}N\big(0,(Ex_i)^2E(y_i^2)\big)$.
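These two limits can be checked numerically. Below is a small Monte Carlo sketch (my addition, not part of the original question) that assumes, for concreteness, $x_i\sim N(1,1)$ and $y_i\sim N(0,1)$ independent; this choice satisfies all the stated moment conditions, and under it $E(x_i^2y_i^2)=2$ and $(Ex_i)^2E(y_i^2)=1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 2000, 2000

# Hypothetical concrete choice satisfying the moment conditions:
# x ~ N(1, 1), y ~ N(0, 1), independent, so E|x| < inf, E(y) = 0, E(xy) = 0.
x = rng.normal(1.0, 1.0, size=(reps, n))
y = rng.normal(0.0, 1.0, size=(reps, n))

a_n = (x * y).sum(axis=1) / np.sqrt(n)   # (1/sqrt(n)) * sum x_i y_i
b_n = y.sum(axis=1) / np.sqrt(n)         # (1/sqrt(n)) * sum y_i
c_n = x.mean(axis=1)                     # (1/n) * sum x_i

print("sample var of a_n    :", a_n.var())         # CLT target: E(x^2 y^2) = 2
print("sample var of b_n c_n:", (b_n * c_n).var()) # Slutsky target: (Ex)^2 E(y^2) = 1
```

The sample variances across the `reps` replications should sit close to the two theoretical limiting variances for this particular choice of distributions.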

My question is: is it possible to obtain the limiting distribution of $a_n-b_nc_n$? Thanks.

My thinking: we can rewrite $b_nc_n$ as $\frac1n\sum_{i=1}^{n}y_i\cdot\frac{1}{\sqrt{n}}\sum_{i=1}^{n}x_i$, which is $o_p(1)O_p(1)=o_p(1)$, so $a_n-b_nc_n = a_n + o_p(1)$ has the same limiting distribution as $a_n$. But I'm not so sure whether this argument is correct.
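One way to probe the argument is to simulate $a_n-b_nc_n$ alongside $a_n$ and compare their spreads. The sketch below (again my addition, not the poster's) assumes for concreteness $x_i\sim N(1,1)$ and $y_i\sim N(0,1)$ independent, which satisfies all the stated moment conditions; note that $E(x_i)=1\neq0$ under this choice, which is where the claimed $O_p(1)$ bound on $\frac{1}{\sqrt n}\sum x_i$ can be stress-tested.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 2000, 2000

# Hypothetical choice: x ~ N(1, 1), y ~ N(0, 1), independent.
# E(x_i) = 1 != 0 here, so (1/sqrt(n)) * sum x_i grows like sqrt(n).
x = rng.normal(1.0, 1.0, size=(reps, n))
y = rng.normal(0.0, 1.0, size=(reps, n))

a_n = (x * y).sum(axis=1) / np.sqrt(n)
b_n = y.sum(axis=1) / np.sqrt(n)
c_n = x.mean(axis=1)
diff = a_n - b_n * c_n

# If b_n c_n were truly o_p(1), these two sample variances would agree.
print("sample var of a_n          :", a_n.var())
print("sample var of a_n - b_n c_n:", diff.var())
```

Comparing the two printed variances for this choice of distributions indicates whether $b_nc_n$ is actually negligible relative to $a_n$.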