Let $X_1, X_2, \dots$ be iid random variables satisfying the following conditions: $\forall x \in \mathbb{R}$, $P(X_1 > x) = P(X_1 < -x)$; and $xP(|X_1| > x) \to c$ as $x \to \infty$ for some finite $c > 0$. Prove that $$ \frac{\sum_{i = 1}^nX_i}{l_n} \to 0 $$ in probability for any sequence $l_n$ such that $l_n/n \to \infty$ as $n \to \infty$.
So the $X_i$ have Cauchy-like tails. My immediate attempt is to truncate by writing $$X_i = Y_i + Z_i,$$ where $$ Y_i = X_i\mathbf{1}\{|X_i| \leq i \}, \quad Z_i = X_i\mathbf{1}\{|X_i| > i \}. $$ It is straightforward to prove that $$ \frac{\sum_{i = 1}^nY_i}{l_n} \to 0 $$ in probability. But how do I prove $$ \frac{\sum_{i = 1}^nZ_i}{l_n} \to 0 $$ in probability? I just can't find a way to bound the term $$ P\left(\left|\frac{\sum_{i = 1}^nZ_i}{l_n}\right| > \epsilon\right). $$ Any ideas? Thank you!
Since convergence in distribution to a constant implies convergence in probability, it suffices to prove that
$$S_n := \frac{X_1+\dots+X_n}{l_n}$$
converges in distribution to $0$. Since $X_1$ has a symmetric distribution, its characteristic function is real-valued and can be written as $\varphi_{X_1}(t)=\mathbb{E}[\cos(tX)]$, where we write $X := X_1$ for brevity. Then by the inequality $1 - \cos x \leq \frac{1}{2}(2 \wedge |x|)^2$ and the Fubini–Tonelli theorem, we get
\begin{align*} \left| 1 - \varphi_{X_1}(t) \right| \leq \frac{1}{2} \mathbb{E}[ (2 \wedge \left| tX \right| )^2 ] = \mathbb{E}\biggl[ \int_{0}^{2} x \mathbf{1}_{\{ x < |tX|\}} \, \mathrm{d}x \biggr] = \int_{0}^{2} x \mathbb{P}( |X| > x/|t|) \, \mathrm{d}x. \end{align*}
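As a quick numerical sanity check (illustration only, not part of the proof), both the elementary inequality and the integral identity behind the middle equality can be verified on a grid:

```python
import math

# Check 1 - cos(x) <= (1/2) * min(2, |x|)^2 on a grid over [-20, 20].
for k in range(-500, 501):
    x = k / 25.0
    assert 1 - math.cos(x) <= 0.5 * min(2.0, abs(x)) ** 2 + 1e-12

# Check the identity (1/2) * min(2, a)^2 = int_0^2 x * 1{x < a} dx for a >= 0,
# using a midpoint Riemann sum.
def riemann(a, steps=100000):
    h = 2.0 / steps
    return sum((i + 0.5) * h * h for i in range(steps) if (i + 0.5) * h < a)

for a in [0.3, 1.0, 1.7, 2.0, 5.0]:
    assert abs(riemann(a) - 0.5 * min(2.0, a) ** 2) < 1e-3
```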
Now the assumption implies that there exists a constant $C > 0$ satisfying $\mathbb{P}(|X| > x) \leq C/x$ for all $x > 0$ (the function $x \mapsto x\,\mathbb{P}(|X|>x)$ is bounded: it converges to $c$ at infinity and is at most $x$ near the origin), and so,
$$ \left| 1 - \varphi_{X_1}(t) \right| \leq 2C|t|. \tag{1} $$
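To see bound $(1)$ on a concrete example (a check script, not part of the proof): for the standard Cauchy distribution, $\varphi(t) = e^{-|t|}$ and $\mathbb{P}(|X|>x) = \tfrac{2}{\pi}\arctan(1/x) \leq \tfrac{2}{\pi x}$, so $C = 2/\pi$ works and $(1)$ reads $1 - e^{-|t|} \leq \tfrac{4}{\pi}|t|$:

```python
import math

# Standard Cauchy: tail P(|X| > x) = (2/pi) * arctan(1/x), so C = 2/pi below.
C = 2 / math.pi

def tail(x):
    return (2 / math.pi) * math.atan(1 / x)

for k in range(1, 1001):
    x = k / 10.0
    assert tail(x) <= C / x + 1e-12       # tail bound P(|X| > x) <= C/x
    t = k / 100.0
    assert 1 - math.exp(-t) <= 2 * C * t  # bound (1): |1 - phi(t)| <= 2C|t|
```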
Then for each fixed $t \in \mathbb{R}$,
$$ \varphi_{S_n}(t) = \varphi_{X_1}(t/l_n)^n = \bigl( 1 - \underbrace{(1-\varphi_{X_1}(t/l_n))}_{=\mathcal{O}(|t|/l_n)} \bigr)^n $$
and since $n \cdot \mathcal{O}(|t|/l_n) = \mathcal{O}(n/l_n) \to 0$, it follows that $\varphi_{S_n}(t) \to 1$ as $n \to \infty$. Therefore $S_n \to 0$ in distribution by Lévy's continuity theorem.
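A quick Monte Carlo illustration (hypothetical, not part of the proof): standard Cauchy variables satisfy both hypotheses, and with the particular choice $l_n = n\log n$ the exceedance probability $P(|S_n| > \epsilon)$ should shrink as $n$ grows:

```python
import numpy as np

# Monte Carlo illustration: standard Cauchy samples, l_n = n * log(n).
rng = np.random.default_rng(0)

def exceed_prob(n, reps=2000, eps=0.5):
    sums = rng.standard_cauchy((reps, n)).sum(axis=1)
    s = sums / (n * np.log(n))          # S_n = (X_1 + ... + X_n) / l_n
    return np.mean(np.abs(s) > eps)     # fraction of runs with |S_n| > eps

p_small, p_big = exceed_prob(200), exceed_prob(2000)
print(p_small, p_big)
```

The exceedance frequency decreases with $n$, consistent with $S_n \to 0$ in probability (slowly, since $\log n$ grows slowly).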
Remark. This proof used only the tail bound $\mathbb{P}(|X_1|>x) \leq C/x$, a weaker consequence of the assumption $x\,\mathbb{P}(|X_1|>x) \to c$. With its full power, we can actually prove a stronger statement:
$$ \frac{X_1+\dots+X_n}{n} \xrightarrow[n\to\infty]{d} \frac{\pi c}{2} Z, $$
where $Z$ has the standard Cauchy distribution. This is an instance of the generalized central limit theorem for stable laws (here with stability index $\alpha = 1$).
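The stable limit can be checked numerically (an illustrative sketch, with an assumed example distribution): take $X = \pm\, 1/U$ with a random sign and $U$ uniform on $(0,1]$, so that $\mathbb{P}(|X|>x) = 1/x$ for $x \geq 1$ and hence $c = 1$. Then $\mathbb{E}[\cos(t\,S_n/n \cdot n)] = \mathbb{E}[\cos(t \cdot \bar{S}_n)]$ with $\bar{S}_n = (X_1+\dots+X_n)/n$ should approach $e^{-\pi |t|/2}$, the characteristic function of $\tfrac{\pi}{2}Z$:

```python
import numpy as np

# Symmetric Pareto-type variables: |X| = 1/U with U uniform on (0,1],
# so P(|X| > x) = 1/x for x >= 1, i.e. c = 1 in the tail assumption.
rng = np.random.default_rng(1)
reps, n, t = 3000, 1500, 1.0

u = 1.0 - rng.random((reps, n))                # uniform on (0, 1]
sign = rng.choice([-1.0, 1.0], size=(reps, n))
s = (sign / u).sum(axis=1) / n                 # (X_1 + ... + X_n) / n

emp = np.cos(t * s).mean()                     # empirical E[cos(t * S_n/n)]
print(emp, np.exp(-np.pi * t / 2))             # should be close to each other
```

The empirical characteristic function lands near $e^{-\pi/2} \approx 0.208$, matching the claimed Cauchy limit with scale $\pi c/2 = \pi/2$.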