Proving almost sure convergence for normalized gaussian series


Given a family $(b_{ij})_{1 \leqslant i < j}$ with $\sum_{i<j} b_{ij}^2 < \infty$ (so it can also be represented as an infinite strictly upper triangular matrix, if you like), prove that the following sequence converges almost surely to $0$:

$$ \dfrac{1}{\sqrt{n}} \sum_{1 \leqslant i<j \leqslant n} b_{ij} X_i X_j$$

for $(X_k)_{k \in \mathbf{N}}$ i.i.d. standard Gaussians, all defined on a common probability space.

I have the result in $L^2$ (hence convergence in probability, and almost sure convergence along a subsequence) but can't get full almost sure convergence. I considered the law of large numbers, obviously, but the terms of the series are clearly not independent. I tried to break the sum up into several sums whose terms are independent (mutually or pairwise), but wasn't able to find a closed form for such a decomposition.
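As a quick numerical sanity check (a sketch, not part of any proof), one can pick a concrete square-summable family, say $b_{ij} = 1/(ij)$ (my choice here, not from the problem statement), and verify by Monte Carlo that the second moment of $n^{-1/2}\sum_{i<j\leqslant n} b_{ij}X_iX_j$ matches $n^{-1}\sum_{i<j\leqslant n} b_{ij}^2$, which is the $L^2$ computation referred to above:

```python
import numpy as np

# Sanity check, assuming the concrete choice b_{ij} = 1/(i*j), which is
# square-summable. We estimate E[(Y_n / sqrt(n))^2] by Monte Carlo and
# compare it with the exact value (1/n) * sum_{i<j<=n} b_{ij}^2.
rng = np.random.default_rng(0)

def upper_triangular_b(n):
    """Strictly upper triangular matrix with entries b_{ij} = 1/(i*j), i < j."""
    idx = np.arange(1, n + 1)
    return np.triu(1.0 / np.outer(idx, idx), k=1)

def second_moment_estimate(n, num_samples=50_000):
    """Monte Carlo estimate of E[(n^{-1/2} sum_{i<j<=n} b_ij X_i X_j)^2]."""
    b = upper_triangular_b(n)
    X = rng.standard_normal((num_samples, n))
    # Y = sum_{i<j} b_{ij} X_i X_j, computed as a quadratic form per sample.
    Y = np.einsum('si,ij,sj->s', X, b, X)
    return np.mean(Y**2) / n

n = 20
exact = np.sum(upper_triangular_b(n)**2) / n   # exact E[(Y_n / sqrt(n))^2]
estimate = second_moment_estimate(n)
print(exact, estimate)
```

With $50{,}000$ samples the estimate agrees with the exact value to within a few percent, consistent with the uncorrelatedness of the products $X_iX_j$.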

Best answer:

Let $Y_n:=\sum_{1\leqslant i<j\leqslant n}b_{i,j}X_iX_j$. Observe that, by independence and the fact that the $X_i$ are centered, the random variables $X_iX_j$ and $X_{i'}X_{j'}$ are uncorrelated whenever $(i,j)\neq (i',j')$. Therefore, $$ \mathbb E\left[Y_n^2\right]=\sum_{1\leqslant i<j\leqslant n}b_{i,j}^2, $$ hence $$\tag{*} \mathbb E\left[\left(\frac{1}{\sqrt n}Y_n\right)^2\right]\leqslant \frac 1n\sum_{1\leqslant i<j}b_{i,j}^2. $$ This shows that $Y_n/\sqrt n$ converges to $0$ in probability. Since $\sum_n 1/n$ diverges, we cannot directly apply the Borel–Cantelli lemma to derive the almost sure convergence.

However, applying $(*)$ with $n=2^N$, we get $$ \mathbb E\left[\left(\frac{1}{2^{N/2}}Y_{2^N}\right)^2\right]\leqslant 2^{-N}\sum_{1\leqslant i<j}b_{i,j}^2. $$ Note that $(Y_n)_{n\geqslant 2}$ is a martingale with difference sequence $d_j:=\sum_{i=1}^{j-1}b_{i,j}X_iX_j$, so Doob's $L^2$ maximal inequality gives $$ \mathbb E\left[\left(\frac{1}{2^{N/2}}\max_{2\leqslant n\leqslant 2^N}\lvert Y_{n}\rvert\right)^2\right]\leqslant 2^{2-N}\sum_{1\leqslant i<j}b_{i,j}^2, $$ which, combined with Chebyshev's inequality and the Borel–Cantelli lemma, is sufficient to derive the almost sure convergence to $0$.
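For completeness, the last step (Chebyshev plus Borel–Cantelli, together with comparing a general $n$ to the nearest dyadic index) can be written out as follows, with $C$ denoting the constant from the maximal inequality:

```latex
% Fix eps > 0. Chebyshev's inequality applied to the maximal bound gives
\mathbb P\left(\max_{2\leqslant n\leqslant 2^N}\lvert Y_n\rvert
    > \varepsilon\, 2^{N/2}\right)
  \leqslant \frac{C\,2^{-N}}{\varepsilon^2}\sum_{1\leqslant i<j} b_{i,j}^2 .
% The right-hand side is summable in N, so by Borel--Cantelli, almost surely
% max_{2 <= n <= 2^N} |Y_n| <= eps * 2^{N/2} for all N large enough.
% For 2^{N-1} < n <= 2^N this yields
\frac{\lvert Y_n\rvert}{\sqrt n}
  \leqslant \frac{\max_{2\leqslant m\leqslant 2^N}\lvert Y_m\rvert}{2^{(N-1)/2}}
  \leqslant \sqrt 2\,\varepsilon
  \quad\text{eventually, almost surely.}
% Hence limsup_n |Y_n| / sqrt(n) <= sqrt(2) * eps a.s.; letting eps -> 0
% along a countable sequence gives Y_n / sqrt(n) -> 0 almost surely.
```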