I have the following problem from my book about Markov processes:
Let $\{Y_n:n\geq 1\}$ be a sequence of mutually independent, identically distributed random variables satisfying $E[|Y_1|]<\infty$. Set $X_n=\sum_{m=1}^n Y_m$ for $n\geq 1$. The Weak Law of Large Numbers says that
$$P\left(\left|\frac{X_n}{n}-E[Y_1]\right|\geq \epsilon\right)\rightarrow 0\;\;\;\text{for all } \epsilon>0.$$
In fact,
$$\lim_{n\rightarrow \infty} E\left[\left|\frac{X_n}{n}-E[Y_1]\right|\right]=0,\;\;\;\;\;\;(1.3.3)$$
from which the above follows as an application of Markov's inequality. Here are steps which lead to (1.3.3).
(a) First reduce to the case when $E[Y_1]=0$. Next, assume that $E[Y_1^2]<\infty$, and show that
$$E\left[\left|\frac{X_n}{n}\right|\right]^2 \leq E\left[\left|\frac{X_n}{n}\right|^2\right]=\frac{E[Y_1^2]}{n}.$$
Hence the result is proved when $Y_1$ has a finite second moment.
(b) Given $R>0$, set $Y_n^{(R)}=Y_n\textbf{1}_{[0,R)}(|Y_n|)-E[Y_n, |Y_n|<R]$ and $X_n^{(R)}=\sum_{m=1}^nY_m^{(R)}$. Note that, for any $R>0$,
$$E\left[\left|\frac{X_n}{n}\right|\right]\leq E\left[\left|\frac{X_n^{(R)}}{n}\right|\right]+E\left[\left|\frac{X_n-X_n^{(R)}}{n}\right|\right]\leq\sqrt{E\left[\left(\frac{X_n^{(R)}}{n}\right)^2\right]}+2E[|Y_1|, |Y_1|\geq R]\leq \frac{R}{\sqrt{n}}+2E[|Y_1|, |Y_1|\geq R],$$
and use this, together with the Monotone Convergence Theorem, to complete the proof of $(1.3.3)$.
I have successfully solved part (a), but I'm now stuck on part (b). My question therefore is: how do I solve part (b)?
Assuming you have established the long inequality in the hint of (b):

Taking the $\limsup$ over $n$ on both sides (the term $\frac{R}{\sqrt{n}}$ vanishes as $n\to\infty$) gives $$ \limsup_n E\left[\left|\frac{X_n}{n}\right|\right]\leq 2E[|Y_1|, |Y_1|\geq R].\tag{1} $$ On the other hand, (1) holds for every $R>0$, and $$ E[|Y_1|, |Y_1|\geq R]\to 0\quad\hbox{as }R\to\infty $$ by the monotone convergence theorem, since $E[|Y_1|, |Y_1|<R]\uparrow E[|Y_1|]<\infty$. Letting $R\to\infty$ in (1) therefore yields $\limsup_n E\left[\left|\frac{X_n}{n}\right|\right]=0$, which is exactly $(1.3.3)$.
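Not part of the book's exercise, but if you want a numerical sanity check of $(1.3.3)$, here is a small Monte Carlo sketch (Python/NumPy; the Pareto distribution, seed, and sample sizes are my own choices). With tail index $\alpha=1.5$ the law has a finite mean but infinite variance, so part (a) alone does not apply and the truncation argument of part (b) is what guarantees the $L^1$ convergence. The estimated error should shrink as $n$ grows, though more slowly than the $1/\sqrt{n}$ rate of the finite-variance case.

```python
import numpy as np

# Minimal sketch (my own illustration, not from the book): estimate
# E[|X_n/n - E[Y_1]|] for a classical Pareto law with x_m = 1 and alpha = 1.5,
# which has finite mean alpha/(alpha - 1) = 3 but infinite variance.
rng = np.random.default_rng(0)

alpha = 1.5
mean_Y = alpha / (alpha - 1.0)   # E[Y_1] for Pareto(alpha) with x_m = 1

num_trials = 2000                # independent copies of (Y_1, ..., Y_n) per n
for n in [10, 100, 1000, 10000]:
    # numpy's pareto() samples the Lomax law on [0, inf); adding 1 shifts it
    # to the classical Pareto on [1, inf)
    Y = rng.pareto(alpha, size=(num_trials, n)) + 1.0
    Xn_over_n = Y.mean(axis=1)                    # X_n / n for each trial
    l1_error = np.abs(Xn_over_n - mean_Y).mean()  # Monte Carlo estimate of E[|X_n/n - E[Y_1]|]
    print(f"n = {n:6d}:  E[|X_n/n - E[Y_1]|] ~ {l1_error:.4f}")
```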