Optimal rate of growth of i.i.d. Gaussians?


Suppose I have a countable collection $\{N_k\}_{k=1}^\infty$ of independent $\mathcal{N}(0,1)$ random variables, and I want to estimate their rate of growth, in the sense that I want a function $f:\mathbb{N}\to\mathbb{R}$ such that $$\sup_k \frac{\vert N_k\vert}{f(k)} <\infty \ \text{ a.s.}$$

Using the basic Gaussian tail inequality $$\mathbb{P} (\vert N_k\vert \geq c)\leq 2e^{-c^2/2}$$ and the Borel–Cantelli lemma, I can show that $f(k)=\sqrt{2\log(1+k)}$ works (the constant $2$ is there only because it simplifies the calculations), and that this $f$ is essentially optimal, in the sense that $\alpha=1/2$ is the threshold exponent for which this works with $f$ of the form $(\log(1+k))^\alpha$.

The problem is that if I define the variable $$Y:= \sup_k \frac{\vert N_k\vert}{\sqrt{2\log(1+k)}},$$ then $Y$ apparently has a heavy-tailed distribution: the estimate I can obtain, again from the inequality above, is $$\mathbb{P}(Y\geq y)\leq \frac{C}{1+y^2}$$ for a suitable constant $C$. So the first thing I'm asking is whether this estimate is essentially sharp, so that $Y$ really is heavy-tailed, or whether there are better estimates showing that $Y$ has some finite moments.

The second question is this: if $Y$ is indeed heavy-tailed, and one instead defines $$Y:= \sup_k \frac{\vert N_k\vert}{g(k)\sqrt{2\log(1+k)}}$$ for a suitable $g$ with $g(k)\to\infty$ as $k\to\infty$, is there some kind of optimal $g$, growing as slowly as possible, such that $Y$ now admits moments of all orders? My intuition tells me this should work, for instance, with $g(k)=\sqrt{\log(\log(1+k))}$, but I'm not able to carry out the calculations cleanly, and I don't know whether something even better exists, i.e. an even slower $g$ such that $Y$ still admits all moments.
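For what it's worth, here is a quick numerical sanity check I tried (a rough sketch only: it truncates the supremum at a finite horizon $K$, which I'm assuming approximates $Y$ reasonably well since the normalization $\sqrt{2\log(1+k)}$ makes late terms unlikely to attain the sup):

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10_000          # truncation horizon for the supremum (assumption: K large enough)
n_samples = 2_000   # independent realizations of the truncated supremum

k = np.arange(1, K + 1)
norm = np.sqrt(2.0 * np.log(1.0 + k))       # f(k) = sqrt(2 log(1+k))

# Each row is one realization of (N_1, ..., N_K), i.i.d. standard normal
N = rng.standard_normal((n_samples, K))

# Truncated version of Y = sup_k |N_k| / sqrt(2 log(1+k))
Y = np.max(np.abs(N) / norm, axis=1)

# Empirical tail P(Y >= y), to compare against the bound C / (1 + y^2)
for y in (1.0, 1.5, 2.0):
    print(f"P(Y >= {y}) ≈ {np.mean(Y >= y):.4f}")
```

The empirical tail decays quickly over this range, but of course a simulation at finite $K$ cannot distinguish a genuine $1/y^2$ tail from a light one, which is why I'd like an analytic answer.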