Compare expectation of probability to the infinite sum of probability


This question is from the proof of Durrett Theorem 2.3.8 and my post here: Using the first and second Borel-Cantelli Lemma to find necessary and sufficient condition for convergence in probability ($98\%$ solved)

In his proof, Durrett uses the inequality below:

$$E|X_{1}|=\int_{0}^{\infty}P(|X_{1}|>x)dx\leq\sum_{n=0}^{\infty}P(|X_{1}|>n).$$

I have no problem with the first equality, but I cannot find any reference in the book of the second inequality. How could I compare the integral with an infinite sum of the probability?
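As a quick numerical sanity check of the two sides (my own addition, not part of Durrett's argument), take the hypothetical choice $X_1\sim\text{Exponential}(1)$, for which the tail $P(|X_1|>x)=e^{-x}$ and $E|X_1|=1$ are known in closed form:

```python
import math

# Sanity check with X ~ Exponential(1): P(X > x) = e^{-x}, E X = 1.
# Tail-integral formula: E X = integral_0^inf P(X > x) dx.
dx = 1e-4
integral = sum(math.exp(-k * dx) * dx for k in range(int(50 / dx)))

# Tail sum starting at n = 0: sum_{n>=0} e^{-n} = 1/(1 - e^{-1}) ~ 1.582.
tail_sum = sum(math.exp(-n) for n in range(60))

assert abs(integral - 1.0) < 1e-3   # integral recovers E X = 1
assert integral <= tail_sum         # the inequality in question
```

So here the integral equals $1$ while the sum starting at $n=0$ is about $1.582$, consistent with the inequality.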

Also, in my post linked above, it seems that we used an inequality as follows:

$$\sum_{n=1}^{\infty}P(|X_{1}|\geq \epsilon n)\leq\dfrac{1}{\epsilon}E|X_{1}|=\dfrac{1}{\epsilon}\int_{0}^{\infty}P(|X_{1}|>x)dx.$$

What inequality are we using here?

I am really confused here....

Edit 1:

Okay I figured it out.

Firstly, as "Math1000" suggested, set $f(x):=P(|X_{1}|>x)\in[0,1]$ and note that $f(x)=1-F_{X_{1}}(x)$. Since $F_{X_{1}}$ is non-decreasing, $f$ is non-increasing (and moreover $f(x)\longrightarrow 0$ as $x\rightarrow\infty$, because $F_{X_{1}}(x)\longrightarrow 1$). So if you draw a graph and note that the sum starts at $n=0$, not $n=1$, the inequality follows immediately: each term $P(|X_{1}|>n)$ dominates the integral of $f$ over $[n,n+1]$.

Also, by the same idea, since $f$ is non-increasing, each term $P(|X_{1}|\geq n)$ with $n\geq 1$ is dominated by the integral of the tail over $[n-1,n]$, so we have $$\int_{0}^{\infty}P(|X_{1}|\geq x)dx\geq\sum_{n=1}^{\infty}P(|X_{1}|\geq n).$$

The point here is where the sum starts: starting at $n=1$ gives a lower bound for the integral, while starting at $n=0$ gives an upper bound — the story is the complete opposite.
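The two-sided Riemann-sum picture can also be checked numerically. Again assuming the illustrative choice of a non-increasing tail $f(x)=e^{-x}$ (i.e. $X_1\sim\text{Exponential}(1)$):

```python
import math

# For a non-increasing tail f(x) = e^{-x}, right Riemann sums
# underestimate and left Riemann sums overestimate:
#   sum_{n>=1} f(n)  <=  integral_0^inf f(x) dx  <=  sum_{n>=0} f(n).
f = lambda x: math.exp(-x)

dx = 1e-3
integral = sum(f(k * dx) * dx for k in range(int(60 / dx)))  # ~ 1

lower = sum(f(n) for n in range(1, 60))  # sum from n = 1: ~ 0.582
upper = sum(f(n) for n in range(0, 60))  # sum from n = 0: ~ 1.582

assert lower <= integral <= upper
```

Here the sum starting at $n=1$ is about $0.582$ and the one starting at $n=0$ is about $1.582$, sandwiching the integral value $1$.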

Secondly, the inequality in the second yellow block is wrong, or at least I don't need it. In the post linked, I tried to show $$\sum_{n=1}^{\infty}P(|X_{1}|\geq \epsilon n)<\infty$$ in order to use the Borel–Cantelli Lemma, under the assumption $E|X_{1}|<\infty$. But I don't really need to connect it to $E|X_{1}|$: since $X_{1}\in L_{1}\subset L_{p}$ for all $1<p\leq\infty$, we can connect it to $E|X_{1}|^{2}<\infty$ instead.

In fact, I don't even need to include epsilon. Using Chebyshev (or Markov), for each $n$ we have $$P(|X_{1}|\geq n)\leq\dfrac{E|X_{1}|^{2}}{n^{2}},$$ and thus $$\sum_{n=1}^{\infty}P(|X_{1}|\geq n)\leq E|X_{1}|^{2}\sum_{n=1}^{\infty}\dfrac{1}{n^{2}}<\infty,$$ by $E|X_{1}|^{2}<\infty$ and the $p$-series test.
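The Markov-type bound above can be spot-checked numerically. Assuming, for illustration only, $X_1\sim\text{Exponential}(1)$ (so $P(X_1\geq n)=e^{-n}$ and $E X_1^2=2$):

```python
import math

# Check P(X >= n) <= E[X^2] / n^2 for X ~ Exponential(1):
# P(X >= n) = e^{-n} and E[X^2] = 2.
EX2 = 2.0
for n in range(1, 30):
    assert math.exp(-n) <= EX2 / n**2

# The bounding series is summable (p-series with p = 2),
# so sum_n P(X >= n) is finite as well.
bound_sum = sum(EX2 / n**2 for n in range(1, 10**5))
assert bound_sum < 3.29  # converges to 2 * pi^2 / 6 ~ 3.2899
```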

I will keep the discussion open for the next two days, so please do not hesitate to point out any mistakes I have made :) I will answer the question myself if the discussion ends.


On BEST ANSWER

Combining $$ \int_{0}^{\infty}\mathsf{P}(|X_{1}|>x)\,dx=\sum_{n\ge 0}\int_{n}^{n+1}\mathsf{P}(|X_{1}|>x)\,dx $$ with the bounds (valid because the tail probability is non-increasing) $$ \mathsf{P}(|X_1|>n+1)\le\int_{n}^{n+1}\mathsf{P}(|X_{1}|>x)\,dx\le \mathsf{P}(|X_1|>n), $$ one gets $$ \sum_{n\ge 1}\mathsf{P}(|X_{1}|>n)\le\int_{0}^{\infty}\mathsf{P}(|X_{1}|>x)\,dx\le \sum_{n\ge 0}\mathsf{P}(|X_{1}|>n). $$

Using the first inequality with $\epsilon^{-1}|X_1|$ in place of $|X_1|$, $$ \sum_{n\ge 1}\mathsf{P}(\epsilon^{-1}|X_{1}|> n)\le \int_0^{\infty}\mathsf{P}(\epsilon^{-1}|X_1|>x)\,dx=\frac{1}{\epsilon}\mathsf{E}|X_1|. $$
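The last display can be checked numerically. Assuming, purely for illustration, $X_1\sim\text{Exponential}(1)$ (so $\mathsf{P}(|X_1|>\epsilon n)=e^{-\epsilon n}$ and $\mathsf{E}|X_1|=1$) and $\epsilon=0.5$:

```python
import math

# Check sum_{n>=1} P(|X| > eps*n) <= (1/eps) * E|X|
# for X ~ Exponential(1), where P(X > eps*n) = e^{-eps*n} and E X = 1.
eps = 0.5
lhs = sum(math.exp(-eps * n) for n in range(1, 200))
# geometric series: e^{-0.5} / (1 - e^{-0.5}) ~ 1.541
assert lhs <= 1.0 / eps  # = 2.0
```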


Okay, I am now going to answer my own post. Note that in my edit, the first part is correct but the second is not, since I remembered the inclusion of $L_{p}$ spaces in the opposite direction. In fact, for a finite measure space and $1\leq p<q\leq\infty$, we have $L_{q}\subset L_{p}$, so a function in $L_{1}$ is not necessarily in $L_{2}$. Thus, we need another way.

For the first:

As "Math1000" suggested, set $f(x):=P(|X_{1}|>x)\in[0,1]$ and note that $f(x)=1-F_{X_{1}}(x)$. Since $F_{X_{1}}$ is non-decreasing, $f$ is non-increasing (and moreover $f(x)\longrightarrow 0$ as $x\rightarrow\infty$, because $F_{X_{1}}(x)\longrightarrow 1$). So if you draw a graph and note that the sum starts at $n=0$, not $n=1$, the inequality follows immediately. In fact, the inequality I suggested in the second yellow box is correct, and I do need it; I will prove it in this answer.

Also, by the same idea, since $f$ is non-increasing, each term $P(|X_{1}|\geq n)$ with $n\geq 1$ is dominated by the integral of the tail over $[n-1,n]$, so we have $$\int_{0}^{\infty}P(|X_{1}|\geq x)dx\geq\sum_{n=1}^{\infty}P(|X_{1}|\geq n).$$

The point here is where the sum starts: starting at $n=1$ gives a lower bound for the integral, while starting at $n=0$ gives an upper bound — the story is the complete opposite.

For the second:

Lemma: Let $X$ be a non-negative random variable, and set $Y:=[X]$, the integer part of $X$. Then $Y=\sum_{n=1}^{\infty}\mathbb{1}_{(X\geq n)}$, and consequently $$E(X)-1\leq \sum_{n=1}^{\infty}P(X\geq n)\leq E(X).$$

Proof of Lemma:

By definition, $Y$ is the integer part of the non-negative random variable $X$. Thus, $Y$ is a non-negative integer-valued random variable, and so $$Y=\sum_{k=1}^{\infty}\mathbb{1}_{(Y\geq k)}.$$

There is not much to argue here, but an example illustrates the identity. Consider a two-player game in which I draw $Y$ and you ask me the following questions:

1) Is $Y\geq 1$? I answer yes or no;

2) Is $Y\geq 2$? I answer yes or no;

3) Is $Y\geq 3$? I answer yes or no;

4) and so on...

Notice that if $Y=3$, then I will answer yes three times, so in general if $Y=i$, I will answer yes $i$ times, and thus $Y=\mathbb{1}_{\{Y\geq 1\}}+\mathbb{1}_{\{Y\geq 2\}}+\mathbb{1}_{\{Y\geq 3\}}+\cdots.$
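The identity $Y=\sum_{k\geq 1}\mathbb{1}_{(Y\geq k)}$ for non-negative integers can be verified exhaustively over a finite range:

```python
# Check Y = sum_{k>=1} 1_{Y >= k} for non-negative integers Y.
# Each y contributes exactly one "yes" (indicator = 1) for k = 1, ..., y.
for y in range(0, 100):
    assert y == sum(1 for k in range(1, 101) if y >= k)
```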

Now, by definition, we have $Y\leq X$, and thus for $k\in\mathbb{Z}_{\geq 1}$, it is immediate that $$\{\omega:Y(\omega)\geq k\}\subset\{\omega:X(\omega)\geq k\}.$$

Conversely, let $\omega\in\{\omega:X(\omega)\geq k\}$. Then, we have the following different cases:

1) If $X(\omega)=k$, then $Y(\omega)=k$;

2) If $X(\omega)>k$ but $X(\omega)-k<1$, then $Y(\omega)=k$.

3) If $X(\omega)>k$ but $X(\omega)-k\geq 1$, then $Y(\omega)\geq k+1>k$.

In all these three cases, we have $\omega\in \{\omega:Y(\omega)\geq k\}$, and thus $$\{\omega:Y(\omega)\geq k\}\supset\{\omega:X(\omega)\geq k\},$$ so that $$\{\omega:Y(\omega)\geq k\}=\{\omega:X(\omega)\geq k\}.$$

Then, it is immediate that $$Y=\sum_{k=1}^{\infty}\mathbb{1}_{(Y\geq k)}=\sum_{k=1}^{\infty}\mathbb{1}_{(X\geq k)}, \ \text{as desired.}$$

Finally, again by definition, $X-1\leq Y\leq X$, so taking expectations (and interchanging the expectation with the sum by monotone convergence) and recalling what we just proved, we have $$E(X)-1\leq\sum_{k=1}^{\infty}P(X\geq k)\leq E(X),\ \text{as desired.}$$
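The lemma's two-sided bound can be sanity-checked numerically, again under the illustrative assumption $X\sim\text{Exponential}(1)$ (so $P(X\geq k)=e^{-k}$ and $E(X)=1$):

```python
import math

# Check E(X) - 1 <= sum_{k>=1} P(X >= k) <= E(X) for X ~ Exponential(1):
# P(X >= k) = e^{-k}, E X = 1.
s = sum(math.exp(-k) for k in range(1, 60))  # = 1/(e - 1) ~ 0.582
assert 1.0 - 1.0 <= s <= 1.0
```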

So the yellow box follows immediately from the lemma by replacing $X$ with $|X_{1}|/\epsilon$: $$\sum_{n=1}^{\infty}P(|X_{1}|\geq\epsilon n)=\sum_{n=1}^{\infty}P(|X_{1}|/\epsilon\geq n)\leq E(|X_{1}|/\epsilon)=\dfrac{1}{\epsilon}E|X_{1}|.$$