I have read that the law of the iterated logarithm states, for i.i.d. random variables $Y_i$ with mean $0$ and variance $1$, $$\limsup_{n\rightarrow\infty}\frac{\sum_{i=1}^nY_i}{\sqrt{2n\log\log n}}=1\quad a.s.$$
But I have also found another version using big-$O$ notation: $$\sum_{i=1}^nY_i=O\big(\sqrt{n\log\log n}\big)\quad a.s.$$
Can someone tell me the relation between these two statements? It is not easy for me to see it from the definitions, since the definition of big $O$ does not involve a limsup.
Thanks a lot for any hints.
The Landau notation $f(n) = O(g(n))$ as $n\to \infty$ means that $\limsup_{n\to\infty} |f(n)|/g(n) < \infty.$ If we define $f(n) = \sum_{i \le n} Y_i$ and $g(n) = \sqrt{n\log\log n},$ then the first statement gives an almost-sure bound on $\limsup_{n\to\infty} |f(n)|/g(n),$ and so implies the second statement (on a set of full measure). Of course, the second statement is strictly weaker: it only asserts the existence of *some* constant $C$ such that $|f(n)| \le C\, g(n)$ for all large $n,$ while the first statement says that any $C > \sqrt{2}$ works (and, moreover, that no $C < \sqrt{2}$ does).
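To make the big-$O$ reading concrete, here is a small sketch (the coin-flip setup and the cutoff `n >= 10` are my own choices, not from the question): simulate partial sums $S_n$ of i.i.d. $\pm 1$ variables (mean $0$, variance $1$) and track the ratio $|S_n|/\sqrt{2n\log\log n}$, whose limsup the LIL pins at $1$. The big-$O$ statement corresponds to this ratio staying bounded along the whole path.

```python
import math
import random

random.seed(0)

# Partial sums S_n of i.i.d. +/-1 coin flips (mean 0, variance 1).
# The LIL says limsup |S_n| / sqrt(2 n log log n) = 1 a.s.; in particular
# the ratio is bounded, which is exactly the big-O statement.
N = 200_000
s = 0
max_ratio = 0.0
for n in range(1, N + 1):
    s += random.choice((-1, 1))
    if n >= 10:  # need log log n > 0, i.e. n > e
        ratio = abs(s) / math.sqrt(2 * n * math.log(math.log(n)))
        max_ratio = max(max_ratio, ratio)

print(max_ratio)  # a finite bound C for this particular sample path
```

The printed maximum is one realization of the (random) constant $C$; the LIL's extra content is that, along almost every path, the ratio keeps returning arbitrarily close to $1$ while never exceeding $1 + \varepsilon$ eventually.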