Suppose we have an infinite double random array $\{X_{ij}:i\geq 1,j\geq 1\}$ whose entries are independent and identically distributed with mean zero and unit variance. Define $$Y_{i,n}=\frac{1}{n}\sum_{j=1}^nX_{ij}.$$ By the strong law of large numbers, for every $i\geq 1$ we have $$Y_{i,n}\to 0\quad\text{a.s.}\quad\text{and}\quad Y_{i,n}^2\to 0\quad\text{a.s.}\qquad(n\to\infty).$$ My question is: do we still have $$\frac{1}{n}\sum_{i=1}^nY_{i,n}^2\to 0\quad\text{a.s.}\quad?$$

First, I know that the analogous statement fails for an arbitrary double array $\{a_{ij}:i\geq 1,j\geq 1\}$. For example, take $$a_{ij}=\frac{2^i}{j};\quad\text{then for every fixed $i\geq 1$,}\quad b_{i,n}=\frac{1}{n}\sum_{j=1}^na_{ij}\to 0,$$ and yet $$\frac{1}{n}\sum_{i=1}^nb_{i,n}^2\to\infty.$$ In the present problem, however, we have the i.i.d. structure, and I believe the statement can be proved.
Convergence about an infinite double random array
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 solutions below.
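Before the proofs, a small numerical sketch of both claims (illustrative only; standard normal entries are an assumption, the question only requires mean zero and unit variance): the i.i.d. average $(1/n)\sum_i Y_{i,n}^2$ concentrates near $1/n$, while the deterministic counterexample $a_{ij}=2^i/j$ blows up.

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. case: X_ij standard normal (mean 0, variance 1).
# E[Y_{i,n}^2] = 1/n, so the average (1/n) * sum_i Y_{i,n}^2 should be near 1/n.
def avg_sq_row_means(n):
    X = rng.standard_normal((n, n))   # X_ij for 1 <= i, j <= n
    Y = X.mean(axis=1)                # Y_{i,n} = (1/n) sum_j X_ij
    return np.mean(Y**2)              # (1/n) sum_i Y_{i,n}^2

vals = {n: avg_sq_row_means(n) for n in (100, 400, 1600)}
print(vals)  # values shrink roughly like 1/n

# Deterministic counterexample a_ij = 2^i / j:
# each row mean b_{i,n} = 2^i * H_n / n -> 0 as n -> infinity (i fixed),
# yet (1/n) sum_i b_{i,n}^2 -> infinity.
def avg_sq_counterexample(n):
    H = np.sum(1.0 / np.arange(1, n + 1))   # harmonic number H_n
    b = 2.0 ** np.arange(1, n + 1) * H / n  # b_{i,n}, i = 1..n
    return np.mean(b**2)

print(avg_sq_counterexample(30) > avg_sq_counterexample(10))  # True: it blows up
```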
Here is another solution: Let $\mathcal{F}_n = \sigma(X_{i,j} : i, j \leq n)$ and
$$ S_{i,n} = \sum_{j=1}^{n} X_{i,j}, \qquad M_n = \frac{1}{n^3} \sum_{i=1}^{n} S_{i,n}^2. $$
Then $(M_n)$ is non-negative, adapted to $(\mathcal{F}_n)$, and satisfies
\begin{align*} \mathbf{E}[M_{n+1} \mid \mathcal{F}_n] &= \frac{1}{(n+1)^3} \left( \sum_{i=1}^{n} (S_{i,n}^2 + 1) + \mathbf{E}\left[S_{n+1,n+1}^2\right] \right) \\ &= \frac{1}{(n+1)^3} \left( \sum_{i=1}^{n} S_{i,n}^2 + n + (n+1) \right) \leq M_n + \frac{2}{(n+1)^2}, \end{align*}
where the $i = n+1$ term is handled separately: $S_{n+1,n+1}$ is independent of $\mathcal{F}_n$, so $\mathbf{E}[S_{n+1,n+1}^2 \mid \mathcal{F}_n] = \mathbf{E}[S_{n+1,n+1}^2] = n+1$.
Now let $\tilde{M}_n = M_n - 2\sum_{k\leq n} k^{-2}$. The above inequality shows that $(\tilde{M}_n)$ is a supermartingale whose negative part is bounded in expectation:
$$ \sup_n \mathbf{E}[(\tilde{M}_n)^-] \leq 2\sum_{k} k^{-2} < \infty . $$
So by Doob's martingale convergence theorem, $(\tilde{M}_n)$ converges a.s., and hence $(M_n)$ also converges a.s. to a non-negative random variable $M_{\infty}$. Moreover, by Fatou's lemma,
$$ \mathbf{E}[M_{\infty}] \leq \liminf_{n\to\infty} \mathbf{E}[M_n] = \liminf_{n\to\infty} \frac{1}{n} = 0. $$
Therefore $M_{\infty} = 0$ a.s.
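To see this decay concretely, here is a Monte Carlo sketch of $M_n = n^{-3}\sum_{i=1}^n S_{i,n}^2$ along one sample path, assuming standard normal entries for illustration (the argument itself only uses mean zero and unit variance):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2000
X = rng.standard_normal((N, N))
S = np.cumsum(X, axis=1)          # S[i, n-1] = S_{i,n} = sum_{j<=n} X_{i,j}

# M_n = n^{-3} * sum_{i=1}^n S_{i,n}^2 along one sample path
ns = (10, 100, 1000, 2000)
M = np.array([np.sum(S[:n, n - 1] ** 2) / n**3 for n in ns])
print(dict(zip(ns, M)))   # shrinks toward 0, consistent with E[M_n] = 1/n
```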
First step: we reduce the proof to the case where the $X_{i,j}$ are bounded by a constant $R$. Let $$X_{i,j;R}:=X_{i,j}\mathbf{1}_{\{\lvert X_{i,j}\rvert\leqslant R \}}-\mathbb E\left[X_{i,j}\mathbf{1}_{\{\lvert X_{i,j}\rvert\leqslant R \}}\right],\qquad X'_{i,j;R}:=X_{i,j}\mathbf{1}_{\{\lvert X_{i,j}\rvert\gt R \}}-\mathbb E\left[X_{i,j}\mathbf{1}_{\{\lvert X_{i,j}\rvert\gt R \}}\right],$$ and set $Y_{i,n;R}:=n^{-1}\sum_{j=1}^n X_{i,j;R}$ and $Y'_{i,n}:=n^{-1}\sum_{j=1}^n X'_{i,j;R}$; note that $X_{i,j}=X_{i,j;R}+X'_{i,j;R}$ since $X_{i,j}$ is centered. Define also the random variable $$ M_i:=\sup_{n\geqslant 1}\left\lvert Y'_{i,n}\right\rvert. $$

Then $$ \frac 1n\sum_{i=1}^n\left(Y'_{i,n}\right)^2\leqslant\frac 1n\sum_{i=1}^nM_i^2 $$ and since $\left(M_i\right)_{i\geqslant 1}$ is a stationary ergodic (indeed i.i.d.) sequence, we derive that $$ \limsup_{n\to\infty}\frac 1n\sum_{i=1}^n\left(Y'_{i,n}\right)^2\leqslant\mathbb E\left[M_1^2\right]. $$ Moreover, by the maximal ergodic theorem, $$\mathbb E\left[M_1^2\right]\leqslant C\mathbb E\left[\left(X'_{1,1;R}\right)^2\right]\leqslant C'\mathbb E\left[X_{1,1}^2\mathbf{1}_{\{\lvert X_{1,1}\rvert>R\}}\right],$$ where $C$ and $C'$ are absolute constants.

Consequently, since $Y_{i,n}=Y_{i,n;R}+Y'_{i,n}$ and $(a+b)^2\leqslant 2a^2+2b^2$, $$ \limsup_{n\to\infty}\frac 1n\sum_{i=1}^n\left(Y_{i,n}\right)^2\leqslant \limsup_{n\to\infty}\frac 2n\sum_{i=1}^n\left(Y_{i,n;R}\right)^2+2C'\mathbb E\left[X_{1,1}^2\mathbf{1}_{\{\lvert X_{1,1}\rvert>R\}}\right]. $$ Hence, once the bounded case is settled, it suffices to let $R\in\mathbb N$ go to infinity.
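A quick numerical sketch of this reduction, assuming (for illustration only) standardized Laplace entries, which have mean zero, unit variance, and tails heavy enough for the truncation to matter: the tail second moment $\mathbb E[X^2\mathbf{1}_{\{\lvert X\rvert>R\}}]$ shrinks as $R$ grows, and the two centering terms in the decomposition cancel.

```python
import numpy as np

rng = np.random.default_rng(2)
# Standardized Laplace sample: mean 0, variance 1, heavier tails than normal.
X = rng.laplace(scale=1 / np.sqrt(2), size=10**6)

# E[X^2 1{|X| > R}]: the quantity that must vanish as R -> infinity
tails = {R: np.mean(X**2 * (np.abs(X) > R)) for R in (1.0, 2.0, 4.0)}
print(tails)   # decreasing in R

# Sample-based check of the decomposition X = X_R + X'_R: the centering terms
# cancel because E[X 1{|X|<=R}] + E[X 1{|X|>R}] = E[X] = 0.
R = 2.0
m_low  = np.mean(X * (np.abs(X) <= R))   # empirical E[X 1{|X|<=R}]
m_high = np.mean(X * (np.abs(X) >  R))   # empirical E[X 1{|X|>R}]
print(abs(m_low + m_high - np.mean(X)))  # ~ 0 by construction
```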
Second step: we assume that $(X_{i,j})_{i,j\geqslant 1}$ is i.i.d., centered and that $\lvert X_{i,j}\rvert\leqslant R$. Since $$ \frac 1n\sum_{i=1}^n Y_{i,n}^2\leqslant\varepsilon+\frac 1n\sum_{i=1}^n Y_{i,n}^2\mathbf{1}_{\{Y_{i,n}^2\gt\varepsilon\}}, $$ one has $$ \left\{\frac 1n\sum_{i=1}^n Y_{i,n}^2>2\varepsilon\right\}\subset\left\{\frac 1n\sum_{i=1}^n Y_{i,n}^2\mathbf{1}_{\{Y_{i,n}^2\gt\varepsilon\}}>\varepsilon\right\}\subset\bigcup_{i=1}^n\left\{ Y_{i,n}^2\gt\varepsilon \right\}. $$ Since the variables $Y_{i,n}^2$, $1\leqslant i\leqslant n$, all have the same distribution, the union bound gives $\mathbb P\left\{\frac 1n\sum_{i=1}^n Y_{i,n}^2>2\varepsilon\right\}\leqslant n\,\mathbb P\left\{Y_{1,n}^2\gt\varepsilon\right\}$, so it suffices to prove that for each positive $\varepsilon$, $\sum_{n\geqslant1 }n\,\mathbb P\left\{ Y_{1,n}^2\gt\varepsilon \right\}<\infty$; this finiteness follows from Hoeffding's inequality (applicable because the $X_{i,j}$ are bounded by $R$), and the Borel–Cantelli lemma then concludes.
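For completeness, the Hoeffding step can be spelled out as follows (a sketch; the constant in the exponent depends on the version of the inequality used). Each $X_{1,j}$ takes values in $[-R,R]$, an interval of length $2R$, so

```latex
\mathbb P\{Y_{1,n}^2 > \varepsilon\}
  = \mathbb P\Big\{\Big|\sum_{j=1}^n X_{1,j}\Big| > n\sqrt{\varepsilon}\Big\}
  \leq 2\exp\!\Big(-\frac{2n^2\varepsilon}{n(2R)^2}\Big)
  = 2\exp\!\Big(-\frac{n\varepsilon}{2R^2}\Big),
\qquad\text{hence}\qquad
\sum_{n\geq 1} n\,\mathbb P\{Y_{1,n}^2 > \varepsilon\}
  \leq \sum_{n\geq 1} 2n\, e^{-n\varepsilon/(2R^2)} < \infty .
```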