Average of Gaussians Converges to One Almost Surely


I am trying to solve the following problem:

Let $g_1, g_2, \ldots$ be i.i.d. standard Gaussian random variables. Let $(a_{ij})_{i < j}$ be in $\ell^2$, i.e. $\sum_{i < j} a_{ij}^2 < \infty$. Let $X_n = \frac{1}{n}\sum_{i = 1}^n g_i^2 + \frac{1}{\sqrt{n}}\sum_{1 \leq i < j \leq n} a_{ij}g_ig_j$. Show that the sequence $(X_n)_{n \geq 1}$ converges to $1$ a.s.

I am really stuck on this problem. I suspect it might be related to Kolmogorov's zero-one law, but my class has not covered that theorem. Any hints to help me move in the right direction would be appreciated.

There are 3 answers below.

Best Answer

1. By SLLN,

$$ \frac{1}{n} \sum_{i=1}^{n} g_i^2 \to \mathbf{E}[g_1^2] = 1 \qquad \text{a.s.} $$

2. Let $Y_n = \sum_{i < j \leq n} a_{ij} g_i g_j$, and let $\mathcal{F}_{n} = \sigma(g_i : i \leq n)$ be the $\sigma$-algebra generated by $g_1,\ldots,g_n$. Then $(Y_n)$ is an $(\mathcal{F}_n)$-martingale, since

$$ \mathbf{E}[Y_{n+1} \mid \mathcal{F}_n] = Y_n + \sum_{i=1}^{n} a_{i,n+1} \mathbf{E}[g_i g_{n+1} \mid \mathcal{F}_n] = Y_n + \sum_{i=1}^{n} a_{i,n+1}\, g_i\, \mathbf{E}[g_{n+1}] = Y_n. $$

Moreover,

$$ \mathbf{E}[Y_n^2] = \sum_{\substack{i < j \leq n \\ k < l \leq n}} a_{ij}a_{kl} \mathbf{E}[g_i g_j g_k g_l] $$

and only the summands with $(i, j) = (k, l)$ survive: since the $g_i$'s are independent with mean zero, $\mathbf{E}[g_i g_j g_k g_l]$ vanishes unless the indices pair up, and the constraints $i < j$ and $k < l$ then force $(i,j) = (k,l)$. This yields

$$ \mathbf{E}[Y_n^2] = \sum_{i < j \leq n} a_{ij}^2 \leq \sum_{i<j} a_{ij}^2. $$

Therefore $(Y_n)$ is an $L^2$-bounded martingale and hence converges a.s. by Doob's martingale convergence theorem. In particular, $\frac{1}{\sqrt{n}} Y_n \to 0$ almost surely, and combining with step 1 gives $X_n \to 1$ a.s. (a numerical sanity check is sketched below).
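
As a quick numerical sanity check (my own sketch, not part of the original answer), the following simulates $X_n$ under the assumed square-summable choice $a_{ij} = 1/(ij)$. For this product form the cross term collapses to $Y_n = \big((\sum_{i\le n} g_i/i)^2 - \sum_{i\le n}(g_i/i)^2\big)/2$, so the whole computation stays linear in $n$:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000
g = rng.standard_normal(N)
n = np.arange(1, N + 1)

# Quadratic part: (1/n) * sum_{i<=n} g_i^2 -> 1 by the SLLN.
quad = np.cumsum(g**2) / n

# Cross part with the assumed a_ij = 1/(i*j).  For this product form,
#   Y_n = sum_{i<j<=n} (g_i/i)(g_j/j) = (S_n^2 - Q_n) / 2,
# where S_n = sum_{i<=n} g_i/i and Q_n = sum_{i<=n} (g_i/i)^2.
w = g / n
Y = (np.cumsum(w)**2 - np.cumsum(w**2)) / 2

X = quad + Y / np.sqrt(n)
for m in (10**2, 10**4, 10**6):
    print(m, X[m - 1])  # printed values should drift toward 1
```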

Answer

I have only a partial answer. Denote $Y_n = X_n - \frac{1}{n}\sum_{i=1}^n g_i^2$. From the strong law of large numbers, $\frac{1}{n}\sum_{i=1}^n g_i^2$ converges to $1$ almost surely, but I am able to prove only that $Y_n$ tends to zero in probability, as follows. Clearly $E(Y_n)=0$ and $E(Y_n^2)=\frac{1}{n}\sum_{i<j\leq n}a^2_{ij}\to 0$ as $n\to\infty$, so Chebyshev's inequality gives $$\Pr(|Y_n|>\epsilon)\leq \frac{1}{\epsilon^2}E(Y_n^2).$$ Since this bound is only of order $1/n$, it is not summable, so Borel–Cantelli does not apply and this is not enough to get almost sure convergence of $Y_n$ to $0$.

Answer

It is a result due to Kolmogorov and Khintchine that if $(X_{n})$ is a sequence of independent random variables such that $\sum_{n}\operatorname{Var}(X_{n})<\infty$, then $\sum_{k=1}^{\infty}\left(X_{k}-E(X_{k})\right)$ converges almost surely to a random variable which is finite almost surely.

A short sketch of the proof is given below.

Well, the reason for the above is the following. Let $\displaystyle S_{n}=\sum_{k=1}^{n}X_{k}$, and assume WLOG that $E(X_{k})=0$ for each $k$ (else consider $X_{k}-E(X_{k})$). Then, for $m > n$,

$$ \|S_{m}-S_{n}\|_{L^{2}}^{2}=\sum_{k=n+1}^{m}\operatorname{Var}(X_{k})\leq\sum_{k=n+1}^{\infty}\operatorname{Var}(X_{k})\xrightarrow{n\to\infty}0, $$

which means that $(S_{n})$ is Cauchy in $L^{2}$, hence Cauchy in probability. By a theorem of Lévy, a sequence of partial sums of independent random variables that is Cauchy in probability converges almost surely.

This idea is also similar to what Sangchul Lee's answer suggests: since $E(X_{k})=0$, what we have shown implies that $(S_{n})$ is an $L^{2}$-bounded martingale, which converges to a random variable $X_{\infty}\in L^{1}$ and hence is finite almost surely.
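
To see the criterion in action, here is a minimal numerical sketch (my illustration, not from the cited theorem) using the random harmonic series $X_k = \epsilon_k/k$ with i.i.d. signs $\epsilon_k = \pm 1$, for which $\sum_k \operatorname{Var}(X_k) = \sum_k 1/k^2 < \infty$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
k = np.arange(1, N + 1)

# X_k = eps_k / k with independent signs eps_k = +/-1:
# E[X_k] = 0 and sum Var(X_k) = sum 1/k^2 < infinity.
eps = rng.choice([-1.0, 1.0], size=N)
partial_sums = np.cumsum(eps / k)

for n in (10**3, 10**4, 10**5, 10**6):
    print(n, partial_sums[n - 1])
# The partial sums settle down to a fixed value, consistent with
# almost sure convergence of the series.
```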

Anyways, getting back to your problem,

First see that $\frac{1}{n}\sum_{i = 1} ^n g_i^2\xrightarrow{a.s.}1 $ due to SLLN.

Next, your condition directly implies that $\sum_{i<j}Var( a_{ij}g_ig_j)=\sum_{i<j}a_{ij}^{2}<\infty$.

Hence $\sum_{1 \leq i < j \leq n} a_{ij}g_ig_j$ converges to an almost surely finite random variable. (One caveat: the summands $a_{ij}g_ig_j$ are uncorrelated but not independent, since they share the $g_i$'s, so the independent-summands statement does not apply verbatim; the $L^2$-martingale form of the same argument, as in Sangchul Lee's answer, covers this case.)

From this, $\frac{1}{\sqrt{n}}\sum_{1 \leq i < j \leq n} a_{ij}g_ig_j\xrightarrow{a.s.}0$ follows easily, since a convergent sequence is in particular bounded almost surely (a numerical sketch follows below).
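
For completeness, a hedged numerical sketch of this last step, again under the assumed choice $a_{ij} = 1/(ij)$ (any square-summable array would do): the partial sums $S_n = \sum_{i<j\le n} a_{ij} g_i g_j$ settle down, while $S_n/\sqrt{n}$ is driven to zero.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1_000_000
n = np.arange(1, N + 1)
g = rng.standard_normal(N)

# With a_ij = 1/(i*j), the partial sums of the cross term collapse to
#   S_n = sum_{i<j<=n} a_ij g_i g_j = ((sum g_i/i)^2 - sum (g_i/i)^2) / 2.
w = g / n
S = (np.cumsum(w)**2 - np.cumsum(w**2)) / 2

for m in (10**2, 10**4, 10**6):
    print(m, S[m - 1], S[m - 1] / np.sqrt(m))
# S_n stabilizes (the series converges a.s.), so S_n / sqrt(n) -> 0.
```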

For a more detailed reference on the Kolmogorov-Khintchine criterion, see Sidney Resnick, A Probability Path, Theorem 7.3.3.