Problem: Let $(\varepsilon_i)_{i\geq 1}$ be a sequence of iid random variables. Let $(x_i)_{i\geq 1}$ be a sequence of real numbers. Assume that:
- $\mathbb E[|\varepsilon_1|]<\infty$ and $\mathbb E[\varepsilon_1]=0$.
- $\frac{1}{n} \sum_{i=1}^n x_i^2$ is bounded in $n$.
Prove that: $\lim_{n\rightarrow \infty} \frac{1}{n} \sum_{i=1}^n x_i \varepsilon_i=0$ with probability $1$.
Consequence: if we choose $x_i=1$ for all $i$, then (loosely speaking) the problem above recovers the strong law of large numbers.
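To see the statement in action, here is a quick numerical sanity check (the choice of weights $x_i=\cos i$ and of a symmetric heavy-tailed noise with infinite variance is mine, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Noise: symmetric Pareto-type variables (tail index 1.5), so
# E|eps_1| < infinity but Var(eps_1) = infinity -- the theorem
# applies even though the classical L^2 / variance argument does not.
eps = rng.choice([-1.0, 1.0], size=n) * rng.pareto(1.5, size=n)

# Deterministic weights with (1/n) * sum_{i<=n} x_i^2 bounded (by 1).
i = np.arange(1, n + 1)
x = np.cos(i)

# Running averages (1/n) * sum_{i<=n} x_i * eps_i.
partial = np.cumsum(x * eps) / i
print(partial[-1])  # typically small, consistent with a.s. convergence to 0
```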
My question: My teacher suggests that I can prove this using Kronecker's lemma, but I have no idea how to apply it. Could anyone give me a hint? Thank you in advance!
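For reference, Kronecker's lemma is the following statement; applying it with $b_n=n$ and $a_n=x_n\varepsilon_n$ reduces the problem to showing that $\sum_{n\geqslant 1} x_n\varepsilon_n/n$ converges almost surely.

```latex
% Kronecker's lemma, for real sequences (a_n) and (b_n):
\[
  0 < b_n \uparrow \infty
  \quad\text{and}\quad
  \sum_{n\geqslant 1}\frac{a_n}{b_n}\ \text{converges}
  \quad\Longrightarrow\quad
  \frac{1}{b_n}\sum_{k=1}^{n} a_k \longrightarrow 0 .
\]
```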
Edit: Based on the useful comments below, the first condition is redundant: if the expectation is understood as a Lebesgue integral, then $\mathbb E[\varepsilon_1]=0$ already presupposes $\mathbb E[|\varepsilon_1|]<\infty$.
Therefore, the assumption should simply read: "$\mathbb E[\varepsilon_1]=0$ in the sense of the Lebesgue integral".
It is important to note that the problem does not apply to the Cauchy distribution: if $\varepsilon_1$ has a Cauchy distribution, then $\mathbb E[\varepsilon_1]$ is undefined in the sense of the Lebesgue integral (see the mean of the Cauchy distribution).
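A small simulation (setup mine, for illustration) shows why averaging does nothing for Cauchy noise: the sample mean of $n$ iid standard Cauchy variables is again standard Cauchy, so its spread never shrinks.

```python
import numpy as np

rng = np.random.default_rng(42)

# Sample means of n iid standard Cauchy variables, repeated many times.
# Each such mean is itself standard Cauchy, so the interquartile range
# of the means stays near 2 (the IQR of one standard Cauchy variable,
# whose quartiles are at -1 and +1), no matter how large n is.
reps, n = 2000, 1000
means = rng.standard_cauchy((reps, n)).mean(axis=1)

q1, q3 = np.percentile(means, [25, 75])
print(q3 - q1)  # stays near 2: no concentration despite averaging
```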
The assumption involves the sum of squares of the $x_i$, which would naturally appear if we looked at the variance of partial sums of $x_i\varepsilon_i$; however, this is not available here because the $\varepsilon_i$ need not have a finite moment of order two. This issue is overcome in the following way.
Step 1: it suffices to prove that $$ 2^{-N}\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{i=1}^nx_i\varepsilon_i\right\rvert\to 0 \mbox{ a.s.}, $$ since for $2^{N-1}<n\leqslant 2^N$ we have $\frac 1n\left\lvert \sum_{i=1}^nx_i\varepsilon_i\right\rvert\leqslant 2\cdot 2^{-N}\max_{1\leqslant m\leqslant 2^N}\left\lvert \sum_{i=1}^mx_i\varepsilon_i\right\rvert$.

Step 2: in order to do that, we use truncation: for a fixed $N$, let $$ \varepsilon_{i,\leqslant}^N:=\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert\leqslant 2^N \}}-\mathbb E\left[\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert\leqslant 2^N \}}\right], $$ $$ \varepsilon_{i,\gt}^N:=\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert> 2^N \}}-\mathbb E\left[\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert>2^N \}}\right]. $$ Then $\varepsilon_i=\varepsilon_{i,\leqslant}^N+\varepsilon_{i,\gt}^N$, and it thus suffices to prove that $$\tag{1} Y_{N}:=2^{-N}\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{i=1}^nx_i\varepsilon_{i,\leqslant}^{N}\right\rvert\to 0 \mbox{ a.s. and } $$ $$\tag{2} Z_N:=2^{-N}\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{i=1}^nx_i\varepsilon_{i,\gt}^{N}\right\rvert\to 0 \mbox{ a.s.} $$

Step 3: for (1), we show that $\sum_{N\geqslant 1}\mathbb E\left[Y_N^2\right]$ is finite. This follows from Kolmogorov's maximal inequality, the inequality $$ \mathbb E\left[\left(\varepsilon_{i,\leqslant}^N\right)^2\right]\leqslant 2 \mathbb E\left[\left(\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert\leqslant 2^N \}}\right)^2\right], $$ the fact that the $\varepsilon_i$ have the same distribution, the assumption on the $x_i$, and the bound $$\sum_{N\geqslant 1}2^{-N}\mathbf{1}_{\{1\leqslant \lvert \varepsilon_1\rvert\leqslant 2^N\}}\leqslant \frac {2}{\lvert\varepsilon_1\rvert}\mathbf{1}_{\{1\leqslant \lvert \varepsilon_1\rvert\}}.$$
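One way to assemble the estimates of Step 3 (here via Doob's $L^2$ maximal inequality, writing $C:=\sup_n \frac 1n\sum_{i=1}^n x_i^2$ for the bound assumed on the $x_i$):

```latex
% Doob's L^2 maximal inequality applied to the martingale
% n -> sum_{i<=n} x_i * eps_{i,<=}^N, then independence:
\[
  \mathbb E\left[Y_N^2\right]
  \leqslant 4\cdot 4^{-N}\sum_{i=1}^{2^N}x_i^2\,
     \mathbb E\left[\left(\varepsilon_{i,\leqslant}^N\right)^2\right]
  \leqslant 8\cdot 4^{-N}\left(\sum_{i=1}^{2^N}x_i^2\right)
     \mathbb E\left[\varepsilon_1^2\mathbf 1_{\{\lvert\varepsilon_1\rvert\leqslant 2^N\}}\right]
  \leqslant 8C\,2^{-N}\,
     \mathbb E\left[\varepsilon_1^2\mathbf 1_{\{\lvert\varepsilon_1\rvert\leqslant 2^N\}}\right].
\]
% Split the last expectation at |eps_1| = 1 and sum over N,
% using sum_N 2^{-N} 1{1 <= |eps_1| <= 2^N} <= (2/|eps_1|) 1{1 <= |eps_1|}:
\[
  \sum_{N\geqslant 1}\mathbb E\left[Y_N^2\right]
  \leqslant 8C\,\mathbb E\left[\varepsilon_1^2\mathbf 1_{\{\lvert\varepsilon_1\rvert\leqslant 1\}}\right]
  + 8C\,\mathbb E\left[\varepsilon_1^2\cdot
      \frac{2}{\lvert\varepsilon_1\rvert}\mathbf 1_{\{1\leqslant\lvert\varepsilon_1\rvert\}}\right]
  \leqslant 8C\left(1+2\,\mathbb E\left[\lvert\varepsilon_1\rvert\right]\right)<\infty .
\]
```

Finiteness of $\sum_N \mathbb E[Y_N^2]$ forces $Y_N\to 0$ a.s. (the partial sums of a convergent series of nonnegative terms are a.s. finite, so the terms tend to $0$).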
Step 4: in order to prove (2), we first notice that $2^{-N}\sum_{i=1}^{2^N}\lvert x_i\rvert\, \mathbb E\left[\lvert\varepsilon_i\rvert\mathbf{1}_{\{\lvert\varepsilon_i\rvert>2^N \}}\right]\to 0$: indeed, $2^{-N}\sum_{i=1}^{2^N}\lvert x_i\rvert$ is bounded by the Cauchy–Schwarz inequality and the assumption on the $x_i$, while $\mathbb E\left[\lvert\varepsilon_1\rvert\mathbf{1}_{\{\lvert\varepsilon_1\rvert>2^N \}}\right]\to 0$ by dominated convergence. By the Borel–Cantelli lemma, we are thus reduced to proving that for each $\delta>0$, $$\sum_N\mathbb P\left(2^{-N}\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{i=1}^nx_i\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert> 2^N \}}\right\rvert>\delta\right)<\infty.$$ To this end, we use the inclusion $$ \left\{2^{-N}\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{i=1}^nx_i\varepsilon_i\mathbf{1}_{\{\lvert\varepsilon_i\rvert> 2^N \}}\right\rvert>\delta \right\}\subset\bigcup_{i=1}^{2^N}\{\lvert\varepsilon_i\rvert> 2^N \}.$$ Consequently, the probabilities above can be controlled by a union bound, and convergence of the series follows from the integrability of $\varepsilon_1$.
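Spelled out, the union bound and the final series estimate read as follows (using that $\sum_{N:\,2^N< v}2^N\leqslant 2v$ for every $v>0$):

```latex
% Union bound over the 2^N events, then identical distribution:
\[
  \mathbb P\left(\bigcup_{i=1}^{2^N}\{\lvert\varepsilon_i\rvert> 2^N\}\right)
  \leqslant 2^N\,\mathbb P\left(\lvert\varepsilon_1\rvert>2^N\right).
\]
% Summing over N and exchanging sum and expectation (Tonelli),
% the geometric sum over {N : 2^N < |eps_1|} is at most 2|eps_1|:
\[
  \sum_{N\geqslant 1} 2^N\,\mathbb P\left(\lvert\varepsilon_1\rvert>2^N\right)
  = \mathbb E\left[\sum_{N\geqslant 1} 2^N\,
      \mathbf 1_{\{\lvert\varepsilon_1\rvert>2^N\}}\right]
  \leqslant 2\,\mathbb E\left[\lvert\varepsilon_1\rvert\right]<\infty .
\]
```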