This question is from the proof of Durrett Theorem 2.3.8 and my post here: Using the first and second Borel-Cantelli Lemma to find necessary and sufficient condition for convergence in probability ($98\%$ solved)
In Durrett's proof, the following inequality is used:
$$E|X_{1}|=\int_{0}^{\infty}P(|X_{1}|>x)dx\leq\sum_{n=0}^{\infty}P(|X_{1}|>n).$$
I have no problem with the first equality, but I cannot find any justification of the second inequality in the book. How can one compare the integral with an infinite sum of probabilities?
Also, in my post linked above, it seems we used the following inequality:
$$\sum_{n=1}^{\infty}P(|X_{1}|\geq \epsilon n)\leq\dfrac{1}{\epsilon}E|X_{1}|=\dfrac{1}{\epsilon}\int_{0}^{\infty}P(|X_{1}|>x)dx.$$
What inequality are we using here?
I am really confused here....
Edit 1:
Okay I figured it out.
Firstly, as "Math1000" suggested, set $f(x):=P(|X_{1}|>x)\in[0,1]$. Since the events $\{|X_{1}|>x\}$ shrink as $x$ grows, $f$ is nonincreasing (equivalently, $f(x)=1-F_{X_{1}}(x)$ and the CDF $F_{X_{1}}$ is nondecreasing); moreover $F_{X_{1}}(x)\longrightarrow 1$ as $x\rightarrow\infty$, so $f(x)\longrightarrow 0$. Now compare the integral of $f$ with the left-endpoint Riemann sum over the intervals $[n,n+1]$: since the sum starts at $n=0$, not $1$, drawing a graph makes the inequality follow immediately.
Also, by the same idea, since $f(x)=P(|X_{1}|\geq x)$ is nonincreasing, comparing with right endpoints gives $$\int_{0}^{\infty}P(|X_{1}|\geq x)dx\geq\sum_{n=1}^{\infty}P(|X_{1}|\geq n).$$
The point is where the sum starts: starting at $n=0$ gives an upper bound for the integral, while starting at $n=1$ gives a lower bound.
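As a sanity check (this example is my own, not from Durrett), the sandwich $\sum_{n\geq 1}f(n)\leq\int_{0}^{\infty}f(x)dx\leq\sum_{n\geq 0}f(n)$ can be verified numerically for a concrete nonincreasing tail, e.g. $f(x)=e^{-x}$, the tail of an Exponential(1) variable, whose integral is exactly $E|X|=1$:

```python
import math

# Tail function f(x) = P(|X| > x) for X ~ Exponential(1)
def f(x):
    return math.exp(-x)

# Approximate the integral of f over [0, inf) by a fine Riemann sum on [0, 50];
# the exact value is E|X| = 1.
N = 200_000
h = 50.0 / N
integral = sum(f(i * h) * h for i in range(N))

lower = sum(f(n) for n in range(1, 200))   # sum starting at n = 1 (lower bound)
upper = sum(f(n) for n in range(0, 200))   # sum starting at n = 0 (upper bound)

assert lower <= integral <= upper
print(lower, integral, upper)  # roughly 0.582 <= 1.0 <= 1.582
```

The two sums differ by exactly the $n=0$ term $f(0)=1$, which is why starting the sum at $0$ versus $1$ flips which side of the integral it lands on.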
Secondly, I don't actually need the inequality in the second yellow block. In the post linked, I tried to show $$\sum_{n=1}^{\infty}P(|X_{1}|\geq \epsilon n)<\infty$$ in order to apply the Borel-Cantelli lemma. Note that on a probability space $L_{p}\subset L_{1}$ for $1<p\leq\infty$ (not the reverse), so $E|X_{1}|<\infty$ does not imply $E|X_{1}|^{2}<\infty$; the argument below therefore works under the stronger assumption $E|X_{1}|^{2}<\infty$, in which case there is no need to connect to $E|X_{1}|$.
In fact I don't even need to include $\epsilon$. Using Chebyshev's (or Markov's) inequality, for each $n$, $$P(|X_{1}|\geq n)\leq\dfrac{E|X_{1}|^{2}}{n^{2}},$$ and thus $$\sum_{n=1}^{\infty}P(|X_{1}|\geq n)\leq E|X_{1}|^{2}\sum_{n=1}^{\infty}\dfrac{1}{n^{2}}<\infty,$$ since $E|X_{1}|^{2}<\infty$ and by the $p$-series test.
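A quick numeric check of this Chebyshev step (again my own illustration with $X\sim$ Exponential(1), where $P(X\geq n)=e^{-n}$ and $E|X|^{2}=2$):

```python
import math

EX2 = 2.0  # E|X|^2 for X ~ Exponential(1)

# Chebyshev/Markov applied to |X|^2:
# P(|X| >= n) = P(|X|^2 >= n^2) <= E|X|^2 / n^2
for n in range(1, 100):
    tail = math.exp(-n)          # exact P(X >= n) for Exponential(1)
    assert tail <= EX2 / n**2    # the bound used above

# The bounding series converges (p-series with p = 2): sum 1/n^2 = pi^2/6
bound = EX2 * sum(1.0 / n**2 for n in range(1, 100_000))
print(bound)  # close to 2 * pi^2 / 6, about 3.29
```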
I will keep the discussion open for the next two days, so please do not hesitate to point out any mistakes I have made :) I will answer the question myself once the discussion ends.
Combining $$ \int_{0}^{\infty}\mathsf{P}(|X_{1}|>x)\,dx=\sum_{n\ge 0}\int_{n}^{n+1}\mathsf{P}(|X_{1}|>x)\,dx $$ and $$ \mathsf{P}(|X_1|>n+1)\le\int_{n}^{n+1}\mathsf{P}(|X_{1}|>x)\,dx\le \mathsf{P}(|X_1|>n), $$ one gets $$ \sum_{n\ge 1}\mathsf{P}(|X_{1}|>n)\le\int_{0}^{\infty}\mathsf{P}(|X_{1}|>x)\,dx\le \sum_{n\ge 0}\mathsf{P}(|X_{1}|>n). $$ Using the first inequality, $$ \sum_{n\ge 1}\mathsf{P}(\epsilon^{-1}|X_{1}|> n)\le \int_0^{\infty}\mathsf{P}(\epsilon^{-1}|X_1|>x)\,dx=\frac{1}{\epsilon}\mathsf{E}|X_1|. $$
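To tie the pieces together, here is a small numerical check of the final inequality $\sum_{n\ge 1}\mathsf{P}(\epsilon^{-1}|X_{1}|>n)\le\epsilon^{-1}\mathsf{E}|X_{1}|$ (my own illustration, again with $X_{1}\sim$ Exponential(1), so $\mathsf{E}|X_{1}|=1$ and $\mathsf{P}(|X_{1}|>x)=e^{-x}$):

```python
import math

eps = 0.5
EX = 1.0  # E|X_1| for X_1 ~ Exponential(1)

# P(|X_1|/eps > n) = P(|X_1| > eps*n) = exp(-eps*n)
lhs = sum(math.exp(-eps * n) for n in range(1, 1000))
rhs = EX / eps

assert lhs <= rhs
print(lhs, rhs)  # about 1.541 <= 2.0
```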