Let $(X_n)_n$ be a sequence of i.i.d. random variables and let $(x_n)_n$ be a sequence in $\mathbb{R}^*$ (nonzero reals). Let $y_n^2=\sum_{k=1}^nx_k^2.$
Suppose that $E[X_1]=0$, $E[X_1^2]=1$, $x_n=o(y_n)$ and $y_n \to +\infty.$ Prove that the sequence of weighted random variables $(x_nX_n)_n$ satisfies the central limit theorem: $$\frac{1}{y_n}\sum_{k=1}^nx_kX_k \Rightarrow N(0;1).$$
Remark (optional): more generally, the following holds: $E[X_1^2]<+\infty$ if and only if there exists a sequence of real numbers $(w_n)_n$ such that $\frac{1}{y_n}\sum_{k=1}^nx_kX_k-w_n$ converges in distribution to some random variable $Y.$ In this case, $Y$ is normally distributed.
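For intuition (not part of any proof), here is a quick Monte Carlo check of the statement; the concrete choices $x_k=\sqrt{k}$ and uniform $X_k$ are my own illustrative assumptions, not part of the problem:

```python
import numpy as np

# Illustrative check of the weighted CLT (all concrete choices are assumptions):
# x_k = sqrt(k), so y_n^2 = n(n+1)/2 and x_n = o(y_n); X_k i.i.d. uniform on
# [-sqrt(3), sqrt(3)], so that E[X_1] = 0 and E[X_1^2] = 1.
rng = np.random.default_rng(0)
n, reps = 2000, 5000
x = np.sqrt(np.arange(1.0, n + 1))                   # weights x_k
y_n = np.sqrt((x ** 2).sum())                        # y_n
X = rng.uniform(-np.sqrt(3), np.sqrt(3), (reps, n))  # i.i.d. samples of the X_k
S = (x * X).sum(axis=1) / y_n                        # normalized weighted sums
print(round(S.mean(), 3), round(S.var(), 3))         # empirically near 0 and 1
```

The empirical mean and variance of `S` should be close to $0$ and $1$, consistent with a limiting $N(0;1)$.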
One way to prove that the CLT holds is to verify the Lindeberg condition. So let $\epsilon>0.$ $$\frac{1}{y_n^2}\sum_{k=1}^nx_k^2E[X_k^2 1_{|x_kX_k|>\epsilon y_n}]=\frac{1}{y_n^2}\sum_{k=1}^nx_k^2E[X_1^2 1_{|x_kX_1|>\epsilon y_n}].$$ I can't see how to continue from here; in particular, how does one remove $x_k$ from the indicator $1_{|x_kX_1|>\epsilon y_n}$?
Any suggestions are welcome.
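For what it's worth, a Monte Carlo estimate of the Lindeberg sum above does suggest it tends to $0$; the choices $x_k=\sqrt{k}$, $X_1\sim N(0,1)$ and $\epsilon=0.5$ below are illustrative assumptions only:

```python
import numpy as np

# Monte Carlo estimate of the Lindeberg sum
#   L_n = (1/y_n^2) * sum_k x_k^2 E[X_1^2 1_{|x_k X_1| > eps y_n}]
# under the illustrative assumptions x_k = sqrt(k), X_1 ~ N(0,1), eps = 0.5.
rng = np.random.default_rng(1)
eps = 0.5
X2 = rng.standard_normal(200_000) ** 2          # samples of X_1^2
Ls = []
for n in (10, 100, 1000):
    x2 = np.arange(1.0, n + 1)                  # x_k^2 = k
    yn2 = x2.sum()                              # y_n^2 = n(n+1)/2
    thr = eps ** 2 * yn2 / x2                   # event: X_1^2 > eps^2 y_n^2 / x_k^2
    tails = np.array([(X2 * (X2 > t)).mean() for t in thr])
    Ls.append((x2 * tails).sum() / yn2)
print(Ls)                                       # decreasing toward 0
```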
I think we have to assume that $y_n\to\infty$; otherwise the convergence would reduce to that of a series of independent random variables, and the limit need not be normal.
Let $c_{n,k}:=E[X_1^2 1_{|x_kX_1|>\epsilon y_n}]$. Since $(y_n)$ is non-decreasing, $c_{n,k}\leqslant c_{k,k}$ for $n\geqslant k$, and $c_{n,k}\leqslant E[X_1^2]=1$. Fix $k_0$ and let $n>k_0$. Then $$ \frac{1}{y_n^2}\sum_{k=1}^nx_k^2c_{n,k}= \frac{1}{y_n^2}\sum_{k=1}^{k_0}x_k^2c_{n,k}+\frac{1}{y_n^2}\sum_{k=k_0+1}^{n}x_k^2c_{n,k}\leqslant \frac{1}{y_n^2}\sum_{k=1}^{k_0}x_k^2+\sup_{k>k_0}c_{k,k}. $$ The first term tends to $0$ as $n\to\infty$, because the sum is fixed and $y_n\to\infty$, hence for each $k_0$, $$\limsup_{n\to\infty} \frac{1}{y_n^2}\sum_{k=1}^nx_k^2c_{n,k} \leqslant \sup_{k>k_0}c_{k,k}. $$ Finally, since $x_k/y_k\to 0$, the indicator $1_{|x_kX_1|>\epsilon y_k}\to 0$ almost surely, so dominated convergence (with dominating function $X_1^2$) gives $c_{k,k}\to 0$; letting $k_0\to\infty$ finishes the proof.
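As a sanity check on the last step, one can estimate $c_{k,k}$ numerically; the choices $x_k=\sqrt{k}$, $X_1\sim N(0,1)$ and $\epsilon=0.5$ are illustrative assumptions:

```python
import numpy as np

# Numerical check that c_{k,k} = E[X_1^2 1_{|x_k X_1| > eps y_k}] tends to 0,
# under the illustrative assumptions x_k = sqrt(k), X_1 ~ N(0,1), eps = 0.5.
rng = np.random.default_rng(2)
eps = 0.5
X2 = rng.standard_normal(10 ** 6) ** 2      # samples of X_1^2
cs = []
for k in (1, 10, 100, 1000):
    yk2 = k * (k + 1) / 2                   # y_k^2 = sum_{j<=k} j
    thr = eps ** 2 * yk2 / k                # event: X_1^2 > eps^2 y_k^2 / x_k^2
    cs.append((X2 * (X2 > thr)).mean())     # Monte Carlo estimate of c_{k,k}
print(cs)                                   # decreasing toward 0
```

The estimates decay to $0$ as $k$ grows, matching the dominated-convergence step, since $y_k/x_k=\sqrt{(k+1)/2}\to\infty$ here.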