This is an example from Durrett, *Probability: Theory and Examples* (5th edition, page 131). Suppose $X_1, X_2, \ldots$ are i.i.d. with $P(X_1>x) = P(X_1<-x)$ and $P(|X_1|>x)=x^{-2}$ for $x\geq 1$. Then $E|X_1|^2 = \infty$, yet $S_n = X_1+\cdots+X_n$, suitably normalized, converges in distribution to a normal limit.
There is one step in the proof that I don't follow.
Let $c_n = n^{1/2}\log\log n$ and $Y_{n,m} = X_m 1_{(|X_m|\leq c_n)}$. To show $EY_{n,m}^2 \geq \log n$ for large $n$, the proof asserts
$$P(|Y_{n,m}|>x) = P(|X_1|>x)-P(|X_1|>c_n) \geq \left(1-(\log\log n)^{-2}\right)P(|X_1|>x) \quad \text{when } x\leq \sqrt{n}.$$ I don't see why this holds: it amounts to the claim that $P(|X_1|>c_n) \leq (\log\log n)^{-2}\,P(|X_1|>x)$, and I don't know why that is true.
Thanks.
Assuming that $n$ is large enough to ensure $c_n\geq 1$, we have $$ \Pr(\lvert X_1\rvert>c_n)=c_n^{-2}= \frac 1n \frac 1{(\log\log n)^2}, $$ and since $x\leq \sqrt n$ implies $n\geq x^2$, $$ \Pr(\lvert X_1\rvert>c_n) \leq \frac 1{x^2} \frac 1{(\log\log n)^2}. $$ Finally, for $x \geq 1$ the tail is exactly $\Pr(\lvert X_1\rvert>x)=x^{-2}$, so the right-hand side equals $(\log\log n)^{-2}\Pr(\lvert X_1\rvert>x)$, which is exactly what we want.
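As a quick numerical sanity check (not part of Durrett's argument), here is a minimal Python sketch that evaluates both sides of the bound for a few values of $x$, assuming the exact tail $P(\lvert X_1\rvert>x)=x^{-2}$ for $x\geq 1$; the choice $n = 10^6$ and all variable names are my own. It also checks the consequence $EY_{n,m}^2 = 2\log c_n \geq \log n$, which follows by integrating $x^2$ against the density $2x^{-3}$ (the derivative of $1 - x^{-2}$) over $[1, c_n]$.

```python
import math

def tail(x):
    """P(|X_1| > x) for the distribution in the example: x**(-2) for x >= 1, else 1."""
    return min(1.0, x ** (-2.0))

n = 10**6                                  # any n large enough that c_n >= 1
L = math.log(math.log(n))                  # loglog n
c_n = math.sqrt(n) * L                     # truncation level c_n = n^(1/2) loglog n

# The claimed bound: P(|X_1| > c_n) <= (loglog n)^(-2) * P(|X_1| > x) for 1 <= x <= sqrt(n).
for x in (1.0, 10.0, 100.0, 999.0):        # at x = sqrt(n) the bound holds with equality
    lhs = tail(c_n)
    rhs = tail(x) / L**2
    print(f"x = {x:7.1f}: P(|X|>c_n) = {lhs:.3e} <= {rhs:.3e} ? {lhs <= rhs}")

# Consequence: E[Y_{n,m}^2] = integral_1^{c_n} x^2 * 2x^{-3} dx = 2 log(c_n) >= log(n).
print(f"E[Y^2] = {2 * math.log(c_n):.3f} >= log n = {math.log(n):.3f}")
```

Note that the inequality is tightest at $x=\sqrt n$, where both sides equal $n^{-1}(\log\log n)^{-2}$; for smaller $x$ there is plenty of slack.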