I am working on the converse of the Law of the Iterated Logarithm, following Exercise 1.5.12 in Stroock's "Probability Theory: An Analytic View". I am stuck on the following step of the proof:
Let $\{X_n\}_{n=1}^{\infty}$ be a sequence of independent, identically distributed real random variables defined on a common probability space $(\Omega, \mathcal{F}, P)$. Put $S_n = \sum_{i=1}^n X_i$. We assume that:
\begin{equation*} P\{\limsup_{n \to \infty} \frac{|S_n|}{\sqrt{2n \ln (\ln n)}} < \infty \} > 0 \end{equation*}
The goal is to prove that $X_1 \in L^2$ (i.e. square integrable) and that:
\begin{equation*} P\{\limsup_{n \to \infty} \frac{S_n}{\sqrt{2n \ln (\ln n)}} = \sqrt{\mathbb{E}[X_1^2]} \} = 1 \end{equation*}
It turns out that we can assume without loss of generality that $X_1$ is symmetric (i.e. $X_1$ and $-X_1$ have the same distribution). Assume from now on that $X_1$ is symmetric. By Kolmogorov's 0-1 law, we can easily see that there exists a $\sigma \in [0, \infty)$ such that:
\begin{equation*} P\{\limsup_{n \to \infty} \frac{|S_n|}{\sqrt{2n \ln (\ln n)}} = \sigma \} = 1 \end{equation*}
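(As a purely numerical aside, not part of the proof: one can watch this ratio along a simulated path. The script below is an illustrative sketch for standard normal increments, where the classical LIL predicts that the $\limsup$ equals $\sqrt{\mathbb{E}[X_1^2]} = 1$ almost surely; the cutoff $n \geqslant 3$ is just to keep $\ln(\ln n)$ positive.)

```python
import numpy as np

# Illustrative sketch only: simulate one path of a standard normal random
# walk and track the LIL ratio |S_n| / sqrt(2 n ln ln n).
rng = np.random.default_rng(0)
n_max = 200_000
x = rng.standard_normal(n_max)
s = np.cumsum(x)                      # s[k] = S_{k+1}

n = np.arange(3, n_max + 1)           # start at n = 3 so that ln(ln n) > 0
ratio = np.abs(s[2:]) / np.sqrt(2 * n * np.log(np.log(n)))

# Along the tail of the path the ratio should be of order 1 (neither
# collapsing to 0 nor blowing up), consistent with limsup = 1 a.s.
tail_max = ratio[n_max // 2:].max()
print(f"max ratio over the second half of the path: {tail_max:.3f}")
```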
Put, for each $t \in (0, \infty)$:
\begin{equation*} \check{X_n^t} = X_n 1_{[0,t]}(|X_n|) - X_n 1_{(t, \infty)}(|X_n|) \end{equation*}
We see that, by symmetry, $X_n$ has the same distribution as $\check{X_n^t}$ for every $t$. Since:
\begin{equation*} X_n 1_{[0,t]}(|X_n|)=\frac{1}{2} (X_n + \check{X_n^t}) \end{equation*}
We see also that:
\begin{equation} P\{ \limsup_{n \to \infty} \frac{1}{\sqrt{2n \ln (\ln n)}} |\sum_{i=1}^nX_i 1_{[0,t]}(|X_i|)| \leqslant \sigma \} =1 \end{equation}
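Spelling this step out (my own filling-in): each $\check{X_n^t}$ is a function of $X_n$ alone, so $\{\check{X_n^t}\}_{n=1}^{\infty}$ is again an i.i.d. sequence with the same common distribution as $\{X_n\}$, and hence its partial sums $\check{S_n^t} = \sum_{i=1}^n \check{X_i^t}$ satisfy the same almost-sure statement with the same $\sigma$. The triangle inequality then gives
\begin{equation*} |\sum_{i=1}^n X_i 1_{[0,t]}(|X_i|)| = \frac{1}{2} |S_n + \check{S_n^t}| \leqslant \frac{1}{2} (|S_n| + |\check{S_n^t}|), \end{equation*}
so dividing by $\sqrt{2n \ln (\ln n)}$ and taking $\limsup$ bounds the left-hand side by $\frac{1}{2}(\sigma + \sigma) = \sigma$ almost surely.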
Up to this point everything has been clear. But next I am asked to show that:
\begin{equation*} \mathbb{E}[X_1^2] = \lim_{t \to \infty} \mathbb{E}[X_1^2, |X_1| \leqslant t ] \leqslant \sigma^2 \end{equation*}
The first equality is just monotone convergence; it is the final inequality $\leqslant \sigma^2$ that is not clear to me.
My attempt was to cite "Determining the square of a $\lim \sup$" and the reverse Fatou lemma for bounded random variables. Squaring equation (1), we have, for any $t$, almost surely
\begin{equation} \sigma^2 \geqslant (\limsup_{n \to \infty} \frac{1}{\sqrt{2n \ln (\ln n)}} |\sum_{i=1}^nX_i 1_{[0,t]}(|X_i|)| )^2 = \limsup_{n \to \infty} \frac{1}{2n \ln (\ln n)} [\sum_{i=1}^n X_i^2 1_{[0,t]}(|X_i|) + 2 \sum_{i=1}^n \sum_{j=1}^{i-1} X_iX_j 1_{[0,t]}(|X_i|) 1_{[0,t]}(|X_j|) ] \end{equation}
So that, by taking expectations and applying reverse Fatou (note that each cross term $\mathbb{E}[X_iX_j 1_{[0,t]}(|X_i|) 1_{[0,t]}(|X_j|)]$ with $i \neq j$ vanishes by independence and symmetry):
\begin{equation*} \sigma^2 \geqslant \lim_{n \to \infty} \frac{1}{2 \ln (\ln n)} \mathbb{E}[ X_1^2 1_{[0,t]}(|X_1|)] = 0 \end{equation*}
So the estimate is vacuous. It would work out if, for instance, I could interchange the limits in $n$ and $t$, but the term on the right does not, say, converge uniformly. I am stuck here.
Edit:
The problem is resolved. I just need to apply the standard (Hartman–Wintner) law of the iterated logarithm to the truncated variables $X_n 1_{[0,t]}(|X_n|)$.
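For future readers, here is a sketch of how that closes the gap (my own reconstruction): for fixed $t$, the variables $X_n 1_{[0,t]}(|X_n|)$ are i.i.d., bounded by $t$ (hence square integrable), and centered by symmetry, so the Hartman–Wintner LIL gives
\begin{equation*} \limsup_{n \to \infty} \frac{1}{\sqrt{2n \ln (\ln n)}} |\sum_{i=1}^n X_i 1_{[0,t]}(|X_i|)| = \sqrt{\mathbb{E}[X_1^2 1_{[0,t]}(|X_1|)]} \quad \text{almost surely.} \end{equation*}
By equation (1) the left-hand side is at most $\sigma$ almost surely, so $\mathbb{E}[X_1^2, |X_1| \leqslant t] \leqslant \sigma^2$ for every $t \in (0, \infty)$, and letting $t \to \infty$ (monotone convergence) yields $\mathbb{E}[X_1^2] \leqslant \sigma^2 < \infty$.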