I am having trouble understanding the proof of
Suppose $X_n \xrightarrow{\mathbb{P}} X$.
$X_n$ is uniformly integrable (u.i.) $\implies$ $X_n \xrightarrow{L^1} X$
The proof and the definitions that I am using are in the picture below. I don't understand the omitted details of how the first three inequalities were derived: $\mathbb{E}|\varphi_K(X_n)-X_n|\le \frac{\varepsilon}{3}\tag{1}$, $\mathbb{E}|\varphi_K(X)-X|\le \frac{\varepsilon}{3}\tag{2}$, and $|\varphi_K(x)-\varphi_K(y)|\le |x-y|\tag{3}$
This is my try for (1):
To derive (1), as said in the text, we need to use uniform integrability, so I use definition (5.1): $\lim_{c \to \infty}\sup_n\mathbb{E}(|X_n|;|X_n|\ge c)=0$, which, by the definition of a limit at infinity, means that
$\forall \varepsilon >0\ \exists M >0$ such that for $K>M$, $\mathbb{E}(|X_n|;|X_n|\ge K)\le \sup_n\mathbb{E}(|X_n|;|X_n|\ge K)< \varepsilon/3 \tag{*}$
$\mathbb{E}|\varphi_K(X_n)-X_n| $
$= \mathbb{E}(|\varphi_K(X_n)-X_n|1_{|X_n|> K})+\mathbb{E}(|\varphi_K(X_n)-X_n|1_{|X_n|\le K}) $
$= \mathbb{E}(|\mp K -X_n|1_{|X_n|> K})+\mathbb{E}(|X_n-X_n|1_{|X_n|\le K})$
$\le \mathbb{E}(K 1_{|X_n|> K}) + \mathbb{E}(|X_n|1_{|X_n|> K})$
$= K \mathbb{P}(|X_n|> K) + \mathbb{E}(|X_n|1_{|X_n|> K})$
$= K \mathbb{P}(|X_n|> K) + \mathbb{E}(|X_n|;|X_n|> K)$
and this is where I am stuck. It would be nice if that "$K \mathbb{P}(|X_n|> K)$" term weren't there, and the second term should have $\ge$ instead of $>$ so that I could use (*). (I guess that would not matter for absolutely continuous random variables, but the proof is supposed to be general, so what if the $X_n$ are discrete? Then the expectations are sums, and it should make a difference, shouldn't it?) How do I proceed from here, and how do I derive (2) and (3)? In (2) I run into a similar problem upon applying equation (5.1), and I am clueless about (3).
The definitions or results that I am using:




For (1), I think you want $\varepsilon/6$ instead of $\varepsilon/3$ in your equation (*). Then one has \begin{align*} \mathbb{E}[K1_{|X_n| > K}] + \mathbb{E}[|X_n| 1_{|X_n| > K}] &\le \mathbb{E}[|X_n|1_{|X_n| > K}] + \mathbb{E}[|X_n| 1_{|X_n| > K}] \\ &= 2 \mathbb{E}[|X_n| 1_{|X_n| > K}] \\ &\le 2 \mathbb{E}[|X_n| 1_{|X_n| \ge K}] \\ &< \frac{\varepsilon}{3}. \end{align*}
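As a quick numerical sanity check of the pointwise bound $K\,1_{|x| > K} \le |x|\,1_{|x| > K}$ used in the first step above (with a made-up Gaussian sample; nothing here comes from the text):

```python
import random

random.seed(0)

K = 2.0
# hypothetical sample playing the role of X_n; any distribution works
xs = [random.gauss(0, 3) for _ in range(100_000)]

# K * P(|X| > K): the term the question asks about
lhs = K * sum(abs(x) > K for x in xs) / len(xs)
# E[|X| 1_{|X| > K}]: the term uniform integrability controls
rhs = sum(abs(x) for x in xs if abs(x) > K) / len(xs)

# the pointwise bound K 1_{|x|>K} <= |x| 1_{|x|>K} forces lhs <= rhs
assert lhs <= rhs
print(lhs, rhs)
```

The inequality holds sample by sample, so it survives taking expectations regardless of the distribution of the $X_n$ (discrete or continuous).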
For (2), you can use the same idea along with the fact that $\mathbb{E}[|X|1_{|X| > K}] \rightarrow 0$ as $K \rightarrow \infty$.
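Concretely, (2) follows the same split as in the question, but with no supremum over $n$ needed; assuming (as the computation in the question suggests) that $\varphi_K$ is the truncation at $\pm K$,
\begin{align*}
\mathbb{E}|\varphi_K(X)-X|
  &= \mathbb{E}\big[(|X|-K)\,1_{|X|>K}\big]
  \le \mathbb{E}\big[|X|\,1_{|X|>K}\big] \xrightarrow[K \to \infty]{} 0,
\end{align*}
where the limit holds by dominated convergence, since $\mathbb{E}|X| < \infty$ (which follows from Fatou's lemma together with the uniform integrability of the $X_n$).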
For (3), this just comes from inspecting the function on a case-by-case basis. If $|x|, |y| \le K$, then $|\varphi_K(x)-\varphi_K(y)| = |x-y|$. If $x > K \ge |y|$, then $|\varphi_K(x)-\varphi_K(y)| = |K-y| \le |x-y|$. You can continue this through the rest of the cases.
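For completeness, here is one way to finish the case analysis, still assuming $\varphi_K(t) = \max(-K, \min(t, K))$:
\begin{align*}
x, y > K:&\quad |\varphi_K(x)-\varphi_K(y)| = |K - K| = 0 \le |x-y|,\\
x > K,\ y < -K:&\quad |\varphi_K(x)-\varphi_K(y)| = |K-(-K)| = 2K < x - y = |x-y|,\\
x, y < -K:&\quad |\varphi_K(x)-\varphi_K(y)| = |-K-(-K)| = 0 \le |x-y|.
\end{align*}
The remaining cases follow by swapping $x$ and $y$ or by the symmetry $\varphi_K(-t) = -\varphi_K(t)$. Once (1)–(3) are available, the proof presumably concludes via the triangle inequality $\mathbb{E}|X_n-X| \le \mathbb{E}|X_n-\varphi_K(X_n)| + \mathbb{E}|\varphi_K(X_n)-\varphi_K(X)| + \mathbb{E}|\varphi_K(X)-X|$, where the middle term goes to $0$ by bounded convergence, since $|\varphi_K| \le K$ and, by (3), $\varphi_K(X_n) \xrightarrow{\mathbb{P}} \varphi_K(X)$.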