Suppose $\{X_i:i\geq 1\}$ is a sequence of iid random variables with $E(X_i) = 0$ and $E X_i^2<\infty$. Let $T_n = \max_{1\leq i\leq n} |X_i|$. I have two questions.
(i) Define $$ a_n\equiv E\big(n^{-1/2}T_n\big),\quad n\geq 1. $$ Is it true that $a_n\rightarrow 0$ without any additional moment assumptions?
(ii) Is it possible to obtain an upper bound on the rate of decay of $a_n$, say $a_n\ll n^{-1/2}$ or some other explicit rate?
If I assume higher moments, I can carry out the proof. But I need some help with how to handle the case where only second moments exist.
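For concreteness, here is a quick Monte Carlo sketch (Python; the centered Pareto law with tail index $2.5$ is just an illustrative choice with a finite second moment but no higher moments). It is only an illustration, not evidence for the general claim, but the printed values are consistent with $a_n\rightarrow 0$:

```python
import numpy as np

# Monte Carlo sketch of a_n = E(n^{-1/2} T_n) for a centered Pareto law.
# The tail index alpha = 2.5 is an arbitrary illustrative choice:
# E X^2 < infinity, but fourth (and higher) moments are infinite.
rng = np.random.default_rng(0)
alpha = 2.5
mean = alpha / (alpha - 1.0)  # mean of classical Pareto(alpha, x_m = 1)

def sample_X(size):
    # classical Pareto(alpha, x_m = 1), shifted to have mean zero
    return (rng.pareto(alpha, size) + 1.0) - mean

for n in [10**2, 10**3, 10**4, 10**5]:
    reps = max(100, 10**6 // n)  # fewer replications for large n
    T = np.abs(sample_X((reps, n))).max(axis=1)  # one T_n per replication
    print(f"n = {n:>6}:  a_n ~ {T.mean() / np.sqrt(n):.4f}")
```

(The decay is slow here: heuristically, for tail index $\alpha$ the maximum grows like $n^{1/\alpha}$, so $a_n\sim n^{1/\alpha-1/2}$, and as $\alpha\downarrow 2$ this exponent tends to $0$; so presumably any bound in (ii) must depend on more than the second moment.)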
Thanks.
Here is a sketch. Let $F$ be the distribution function of $|X_i|$ and let $G=1-F$ (so $G$ is nonincreasing and tends to $0$). A finite second moment means that $\int_0^\infty 2tG(t)\,dt<\infty$, which implies $G(t)=o(\frac 1 {t^2})$: since $G$ is nonincreasing,
$$\frac{t^2}{2}G(t)=G(t)\int_{t/\sqrt 2}^{t}2s\,ds\leq\int_{t/\sqrt 2}^{t}2sG(s)\,ds\longrightarrow 0\quad(t\to\infty),$$
being the tail of a convergent integral. Let $t_0=0$ and for $k\geq 1$ choose $t_k$ so that $G(t_k)=\frac 1 {nk}$ (if $G$ has jumps, take $t_k=\inf\{t:G(t)\leq\frac 1 {nk}\}$ instead; and if $G$ vanishes at some finite point, then $T_n$ is bounded and the claim is trivial). Then
$$\begin{aligned}
E(T_n)&=\int_0^{\infty}\big(1-(1-G(t))^n\big)\,dt=\sum_{k=0}^{\infty}\int_{t_k}^{t_{k+1}}\big(1-(1-G(t))^n\big)\,dt\\
&\leq t_1+\sum_{k=1}^{\infty}\int_{t_k}^{t_{k+1}}\Big(1-\Big(1-\frac{1}{nk}\Big)^n\Big)\,dt\ \leq\ t_1+\sum_{k=1}^{\infty}\frac{t_{k+1}-t_k}{k}.
\end{aligned}$$
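The first equality uses $E(T_n)=\int_0^\infty P(T_n>t)\,dt$ together with $P(T_n\leq t)=F(t)^n=(1-G(t))^n$. In the first inequality, the $k=0$ term is bounded by $t_1$ since the integrand is at most $1$, while for $k\geq 1$ monotonicity gives $G(t)\leq G(t_k)=\frac{1}{nk}$ on $[t_k,t_{k+1}]$; the second inequality is Bernoulli's:
$$1-\Big(1-\frac{1}{nk}\Big)^n\leq n\cdot\frac{1}{nk}=\frac{1}{k},\qquad\text{since }(1-x)^n\geq 1-nx\text{ for }x\in[0,1].$$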
Applying the tail estimate at $t=t_k$, where $G(t_k)=\frac{1}{nk}$, gives $t_k=\sqrt{nk\,h(t_k)}$ with $h(t):=t^2G(t)\to 0$, so $t_k=o(\sqrt{nk})$. Bounding the increments $t_{k+1}-t_k$ one by one is delicate, but summation by parts (spelled out below) shows that $t_1+\sum_{k\geq 1}\frac{t_{k+1}-t_k}{k}=o(\sqrt n)$, hence $E(T_n)=o(\sqrt n)$ and $a_n\to 0$. This shows that we don't even need a finite second moment; we just need $G(t)=o(\frac 1 {t^2})$.
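To fill in that last step: for the partial sums, summation by parts gives
$$\sum_{k=1}^{K}\frac{t_{k+1}-t_k}{k}=\frac{t_{K+1}}{K}-t_1+\sum_{k=2}^{K}\frac{t_k}{k(k-1)}.$$
Note $h$ is bounded, since $h(t)\leq t^2$ on compacts and $h(t)\to 0$ at infinity. Hence $t_{K+1}\leq\sqrt{n(K+1)\sup h}$, and the boundary term $\frac{t_{K+1}}{K}$ vanishes as $K\to\infty$; moreover
$$\sum_{k=2}^{\infty}\frac{t_k}{k(k-1)}\leq\sqrt{n}\,\sum_{k=2}^{\infty}\frac{\sqrt{h(t_k)}}{\sqrt{k}\,(k-1)}=o(\sqrt{n}),$$
by dominated convergence in $k$: for each fixed $k$, $t_k\to\infty$ as $n\to\infty$ (so $h(t_k)\to 0$), and the terms are dominated by the summable sequence $\frac{\sqrt{\sup h}}{\sqrt{k}\,(k-1)}$. Together with $t_1=\sqrt{n\,h(t_1)}=o(\sqrt{n})$, this gives $E(T_n)=o(\sqrt{n})$.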