Let $S_n = X_1 + \cdots + X_n$ be the partial sums of i.i.d. standard normal random variables, so that $S_n \sim N(0,n)$.
My objective is to prove:
$$\sum_{n=1}^\infty P(S_n \geq (1+\epsilon)\sqrt{2 n \log \log n}) < \infty$$
for all $\epsilon > 0$.
I cannot use the law of the iterated logarithm itself, since the reason I want to prove the above inequality is to derive a weaker form of the LIL ($\limsup \leq 1$ instead of $=1$).
My attempt has been to bound the tail probabilities using known inequalities for the standard normal. Since $S_n/\sqrt{n}$ is standard normal, the claim can be rewritten as
$$\sum_{n=1}^\infty P(S_n/\sqrt{n} \geq (1+\epsilon)\sqrt{2 \log \log n}) = \sum_{n=1}^\infty \big(1-\Phi((1+\epsilon)\sqrt{2 \log \log n})\big) < \infty.$$
Inequalities I know of:
$$1-\Phi(x) \leq x^{-1} e^{-x^2/2}/\sqrt{2\pi}$$
$$1-\Phi(x) \leq \frac{1}{2} e^{-x^2/2}$$
But neither is enough to bound the sum by a convergent series. Alternatively, I also know that
$1-\Phi(x) \sim \phi(x)/x$ as $x \to \infty$, but that doesn't seem to help.
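(A minimal numerical sketch of why the bounds can't work here, not part of any proof; the choice $\epsilon = 0.1$ and the cutoffs $10^4$, $10^5$ are arbitrary. With $x_n = (1+\epsilon)\sqrt{2\log\log n}$ one has $e^{-x_n^2/2} = (\log n)^{-(1+\epsilon)^2}$, and the partial sums of such terms keep growing:)

```python
import math

def tail(x):
    # exact Gaussian tail: 1 - Phi(x) = erfc(x / sqrt(2)) / 2
    return 0.5 * math.erfc(x / math.sqrt(2))

eps = 0.1

def term(n):
    # n-th summand: P(S_n / sqrt(n) >= (1 + eps) sqrt(2 log log n))
    return tail((1 + eps) * math.sqrt(2 * math.log(math.log(n))))

# partial sums grow roughly like N / (log N)^{(1+eps)^2}, without bound
partial_1e4 = sum(term(n) for n in range(3, 10**4))
partial_1e5 = sum(term(n) for n in range(3, 10**5))
print(partial_1e4, partial_1e5)
```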
(Too long for a comment)
This cannot be shown in the way you're trying. Indeed, the smallest normalized threshold for which this term-by-term approach can work is $\Omega(\sqrt{\log n})$, not $O(\sqrt{\log \log n})$. This is because a commensurate lower bound on the Gaussian tail exists:
$$1 - \Phi(x) \ge \frac{1}{\sqrt{2\pi}} \frac{x}{x^2 + 1} e^{-x^2/2}$$
(to show this, integrate by parts one more time in the proof of the first upper bound you state).
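(A quick numerical check of the sandwich $\frac{x}{x^2+1}\phi(x) \le 1-\Phi(x) \le \frac{\phi(x)}{x}$ for $x > 0$, using only the standard library; the sample points are arbitrary:)

```python
import math

def tail(x):
    # exact Gaussian tail: 1 - Phi(x) = erfc(x / sqrt(2)) / 2
    return 0.5 * math.erfc(x / math.sqrt(2))

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

for x in [0.5, 1.0, 2.0, 4.0, 8.0]:
    lower = phi(x) * x / (x * x + 1)
    upper = phi(x) / x
    assert lower <= tail(x) <= upper
    print(x, lower, tail(x), upper)
```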
One reason you are failing is the following: if this proof strategy worked, the following statement would also follow. Consider independent Gaussians $Z_n \sim \mathcal{N}(0,1).$ Then $\{Z_n \ge (1+\epsilon)\sqrt{2\log \log n} \}$ occurs only finitely often a.s. It is intuitive that this is false (for a proof, use the lower bound above and the second Borel–Cantelli lemma). The LIL truly uses the fact that one is dealing with a sum of Gaussians, by exploiting the correlations between $S_n$ and $S_{(1+c)n}$ for small $c$.
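(A Monte Carlo sketch of the false statement above, not a proof; $\epsilon = 0.1$, the seed, and the window near $n = 10^6$ are arbitrary choices. Because the $Z_n$ are independent, it suffices to sample only a late window of indices and count exceedances:)

```python
import math
import random

random.seed(0)
eps = 0.1

def threshold(n):
    # the LIL-type level (1 + eps) * sqrt(2 log log n)
    return (1 + eps) * math.sqrt(2 * math.log(math.log(n)))

# each index exceeds with prob ~ (log n)^{-(1+eps)^2}, about 0.6% here,
# so exceedances keep occurring even this late in the sequence
window = range(10**6, 10**6 + 10**4)
exceed = sum(random.gauss(0.0, 1.0) >= threshold(n) for n in window)
print(exceed)
```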