S.N. Bernstein Law of Large Numbers


While reading the paper "Network Embedding as Matrix Factorization: Unifying DeepWalk, LINE, PTE, and node2vec" (http://keg.cs.tsinghua.edu.cn/jietang/publications/WSDM18-Qiu-et-al-NetMF-network-embedding.pdf), I found that the authors state the following lemma in the Appendix:

(S.N. Bernstein Law of Large Numbers) Let $Y_1,Y_2,\ldots$ be a sequence of random variables with a common finite expectation $E(Y_j)=\mu<\infty$ (which, as I understand it, is the same for every $Y_j$), uniformly bounded variances $Var(Y_j)\leq K<\infty$ for $j\geq 1$, and covariances satisfying $$ Cov(Y_i, Y_j)\rightarrow 0 \quad\text{as } |i-j|\rightarrow\infty. $$ Then the law of large numbers holds.

However, I could not find a proof, or any similar statement of Bernstein's law of large numbers, when I tried to google it. For this lemma the authors cite Problems in Probability by Albert N. Shiryaev, which I believe is an exercise book.

Interestingly, the statement does not assume that the random variables are independent.

Could anyone tell me where I can find a source for this theorem and its proof? Thanks!

I hope this question is appropriate; if not, please tell me and I'll remove it.


Well, I think I was able to prove it myself after looking into it.

Write $S_n=\sum_{i=1}^n Y_i$ and $E(Y_i)=\mu$. By Chebyshev's inequality, $$ P(|S_n/n-\mu|>\epsilon)\leq \frac{Var(S_n/n)}{\epsilon^2}. $$

And, using $Var(Y_i)\leq K$ for the diagonal terms, \begin{eqnarray} Var(S_n/n) = \frac{1}{n^2}Var(S_n) = \frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n Cov(Y_i, Y_j)\leq \frac{1}{n^2}\Big(nK+\sum_{i\neq j} |Cov(Y_i,Y_j)|\Big). \end{eqnarray}

Using the covariance condition: given $\lambda>0$, there is an $M$ such that $|i-j|>M$ implies $|Cov(Y_i, Y_j)|<\lambda$. Then for $n>M$, \begin{eqnarray} \frac{1}{n^2}\Big(nK+\sum_{i\neq j} |Cov(Y_i,Y_j)|\Big) & = & \frac{1}{n^2}\Big(nK+\sum_{0<|i-j|\leq M}|Cov(Y_i,Y_j)| + \sum_{|i-j|>M} |Cov(Y_i,Y_j)|\Big) \\ &\leq &\frac{1}{n^2}\Big(nK+\sum_{0<|i-j|\leq M}|Cov(Y_i, Y_j)| + n^2\lambda\Big) \\ & \leq & \frac{1}{n^2}\big(nK + n^2\lambda + 2nMK\big), \end{eqnarray} since by Cauchy–Schwarz $|Cov(Y_i,Y_j)|\leq\sqrt{Var(Y_i)Var(Y_j)}\leq K$, and there are at most $2nM$ pairs $(i,j)$ with $0<|i-j|\leq M$.

With $\lambda$ (and hence $M$) fixed, letting $n\to\infty$ makes the terms $K/n$ and $2MK/n$ vanish, so $\limsup_n Var(S_n/n)\leq\lambda$. Since $\lambda>0$ was arbitrary, $Var(S_n/n)\to 0$, and the weak law of large numbers follows from the Chebyshev bound at the start.
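As a numerical sanity check (a sketch, not part of the proof), here is a quick simulation with a sequence that is dependent but satisfies the hypotheses: a moving-average sequence $Y_i = \mu + (Z_i + Z_{i+1})/2$ with $Z_i$ i.i.d. standard normal, so that $Cov(Y_i, Y_j)=0$ whenever $|i-j|>1$. The function name `sample_mean` and the chosen $\mu$ are my own for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 3.0  # common expectation E(Y_i) = mu (illustrative choice)

def sample_mean(n):
    # Y_i = mu + (Z_i + Z_{i+1}) / 2: the Y_i are dependent,
    # but Cov(Y_i, Y_j) = 0 as soon as |i - j| > 1,
    # so Bernstein's covariance condition holds trivially.
    z = rng.standard_normal(n + 1)
    y = mu + 0.5 * (z[:-1] + z[1:])
    return y.mean()

# S_n / n should concentrate around mu as n grows,
# even though the sequence is not independent.
for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```

Here $Var(Y_i)=1/2$ and $Cov(Y_i,Y_{i+1})=1/4$, so $Var(S_n/n)\approx 1/n$, and the printed sample means should visibly tighten around $\mu$ as $n$ increases.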

Conclusion: always try to solve the problem yourself before asking or googling about it.