Let $X_i \sim N(0, \sigma^2)$ be independent. I would like to show that $\frac{1}{n}\sum\limits_{i=1}^n(\sum\limits_{j=1}^i X_j)^2 \rightarrow \infty$ as $n \rightarrow \infty$ in probability. My problem is that the squared sums aren't independent. Does anyone have an idea which theorem or trick could be useful here?
Many thanks in advance!
It is enough to show that $Y_n := \big(\sum\limits_{i=1}^n X_i\big)^{2} \to \infty$ in probability, because $Y_n \to \infty$ in probability implies $\frac 1n \sum\limits_{i=1}^{n}Y_i \to \infty$ in probability (see the note below).
Now let $Z_n=\frac 1 {\sqrt {n}} \sum\limits_{i=1}^n X_i$. Then $Z_n \sim N(0,\sigma^{2})$ for every $n$. Hence, for each fixed $M \in (0,\infty)$, $P\big(\big(\sum\limits_{i=1}^n X_i\big)^{2} \leq M\big)=P\big(Z_n^{2} \leq \tfrac M n\big) \to 0$ as $n \to \infty$, since the distribution of $Z_n^{2}$ does not depend on $n$. This finishes the proof.
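Not part of the proof, but the probability above can be evaluated in closed form: $P(|Z| \leq \sqrt{M/n}) = \operatorname{erf}\big(\sqrt{M/n}/(\sigma\sqrt 2)\big)$ for $Z \sim N(0,\sigma^2)$. A quick Python sketch (the function name is mine) shows it vanishing as $n$ grows:

```python
import math

def prob_below(M, n, sigma=1.0):
    """P((X_1 + ... + X_n)^2 <= M) for i.i.d. N(0, sigma^2) terms.

    Equals P(|Z| <= sqrt(M/n)) with Z ~ N(0, sigma^2),
    i.e. erf(sqrt(M/n) / (sigma * sqrt(2))).
    """
    return math.erf(math.sqrt(M / n) / (sigma * math.sqrt(2.0)))

# The probability of staying below any fixed M tends to 0:
for n in (10, 1_000, 100_000):
    print(n, prob_below(M=100.0, n=n))
```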
Note: for nonnegative random variables, $W_n \to \infty$ in probability iff $Ee^{-W_n} \to 0$. (For the "if" direction, Markov's inequality gives $P(W_n \leq M)=P(e^{-W_n} \geq e^{-M}) \leq e^{M}\,Ee^{-W_n}$; for the "only if" direction, $Ee^{-W_n} \leq P(W_n \leq M)+e^{-M}$.) Hence $Ee^{-Y_n} \to 0$, and therefore the Cesàro averages satisfy $\frac 1 n \sum\limits_{i=1}^{n}Ee^{-Y_i} \to 0$ as well. Since $e^{-x}$ is a convex function, $e^{-\frac 1 n \sum\limits_{i=1}^{n}Y_i} \leq \frac 1 n \sum\limits_{i=1}^{n}e^{-Y_i}$ pointwise, so $Ee^{-\frac 1 n \sum\limits_{i=1}^{n}Y_i} \leq \frac 1 n \sum\limits_{i=1}^{n}Ee^{-Y_i} \to 0$. Hence $\frac 1n \sum\limits_{i=1}^{n}Y_i \to \infty$ in probability.
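As a sanity check (a simulation sketch, not part of the argument; the function name is mine), one can simulate the original statistic directly. Since $EY_i = i\sigma^2$, the statistic grows roughly like $n\sigma^2/2$, which the averaged paths reflect:

```python
import random

def mean_of_squared_partial_sums(n, sigma=1.0, seed=None):
    """Simulate (1/n) * sum_{i=1}^n (X_1 + ... + X_i)^2
    along one path of i.i.d. N(0, sigma^2) variables."""
    rng = random.Random(seed)
    partial = 0.0   # running partial sum X_1 + ... + X_i
    total = 0.0     # running sum of squared partial sums
    for _ in range(n):
        partial += rng.gauss(0.0, sigma)
        total += partial * partial
    return total / n

# Average a few independent paths so the growth is visible
# through the sampling noise.
for n in (100, 1_000, 10_000):
    avg = sum(mean_of_squared_partial_sums(n, seed=s) for s in range(30)) / 30
    print(f"n = {n:>6}: average statistic = {avg:.1f}")
```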