Solution verification on a Borel Cantelli Exercise


I have a question regarding the following exercise that I just solved.

Let $(X_n)$ be any sequence of random variables. Show that there exists a sequence of positive constants $(l_n)$ such that $$ P\left(\lim_{n \to \infty} \frac{X_n}{l_n} = 0\right) = 1. $$

Solution: To get started, there exists a sequence of positive constants $(l_n)$ such that for every $n \in \mathbb N$ one has $P(|X_n|/l_n \geq 1/n) \leq 1/n^2$. Indeed, applying Markov's inequality one has $$ P\left(\frac{|X_n|}{l_n} \geq \frac{1}{n}\right) \leq \frac{n\, E[|X_n|]}{l_n} \leq \frac{1}{n^2}, $$ if one chooses $l_n \geq n^3 E[|X_n|]$. Now to finish the proof it suffices to show that for every $\epsilon > 0$ one has $$ P\left(\limsup_{n \to \infty} \frac{|X_n|}{l_n} \geq \epsilon\right) = 0. $$ We are going to apply a Borel–Cantelli argument. For this, observe that for every $\epsilon > 0$ there exists some $n \in \mathbb N$ such that $\epsilon > 1/n$; applying this observation in combination with the prior estimate yields $$ \sum_{n \in \mathbb N} P\left(\frac{|X_n|}{l_n} \geq \epsilon\right) \leq \sum_{n \in \mathbb N} P\left(\frac{|X_n|}{l_n} \geq \frac{1}{n}\right) \leq \sum_{n \in \mathbb N} \frac{1}{n^2} < \infty. $$ Hence by the first Borel–Cantelli lemma one has $$ P\left(\limsup_{n \to \infty} \frac{|X_n|}{l_n} \geq \epsilon\right) = 0, $$ and this suffices.
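As a sanity check of the construction (a minimal numerical sketch, not part of the proof): assume a concrete distribution, say $X_n$ exponential with rapidly growing mean $E[X_n] = 2^n$ (this particular choice is an assumption made purely for illustration), and take $l_n = n^3\, E[X_n]$ as in the Markov step; then $X_n / l_n$ should be of order $1/n^3$ along a sample path.

```python
import random

# Sketch: X_n ~ Exponential with mean E[X_n] = 2**n (an illustrative choice;
# the argument works for any integrable X_n), and l_n = n**3 * E[X_n], so
# Markov's inequality gives P(X_n / l_n >= 1/n) <= 1/n**2, a summable bound.

random.seed(0)

N = 60
means = [2.0 ** n for n in range(1, N + 1)]          # E[X_n] = 2**n
l = [n ** 3 * m for n, m in zip(range(1, N + 1), means)]  # l_n = n**3 * E[X_n]

# One sample path of the normalized sequence X_n / l_n.
# random.expovariate(lambd) has mean 1/lambd, so we pass 1/E[X_n].
path = [random.expovariate(1.0 / m) / ln for m, ln in zip(means, l)]

# In distribution, X_n / l_n equals an Exponential(1) variable divided by
# n**3, so the tail of the path is tiny with overwhelming probability.
tail_max = max(path[4:])  # indices n >= 5
print(tail_max)
```

Plotting or tabulating `path` makes the $1/n^3$ decay visible; the point is only that the normalization forces the path to $0$, exactly as the Borel–Cantelli argument predicts.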

Question: Is this reasoning, especially the choice of $(l_n)$, correct? And what if one takes $X_n = + \infty$ for all $n \in \mathbb N$, as this should be allowed for random variables?

Accepted answer:

There are two (slight) mistakes. Also, you need to state that the $X_{n}$ are integrable if you are planning to use Markov's inequality.

Firstly, you fix $\epsilon$ and then say

$$ \sum_{n \in \mathbb N} P(|\frac{X_n}{l_n}| \geq \epsilon) \leq \sum_{n \in \mathbb N} P(|\frac{X_n}{l_n}| \geq 1/n) \leq \sum_{n \in \mathbb N} 1/n^2 < \infty. $$

No. This is not true as stated: the first inequality requires $\frac{1}{n} \leq \epsilon$, which fails for the (finitely many) small $n$ with $\frac{1}{n} > \epsilon$.

What you need to do is say that there exists $N_{0}$ such that $\frac{1}{n}<\epsilon$ for all $n> N_{0}$.

Hence $P(|\frac{X_{n}}{l_{n}}|\geq\epsilon)\leq P(|\frac{X_{n}}{l_{n}}|\geq\frac{1}{n})\leq \frac{1}{n^{2}}$ for all $n>N_{0}$.

Hence \begin{align} \sum_{n \in \mathbb N} P(|\frac{X_n}{l_n}| \geq \epsilon)&=\sum_{n=1}^{N_{0}}P(|\frac{X_n}{l_n}| \geq \epsilon)+\sum_{n=N_{0}+1}^{\infty}P(|\frac{X_n}{l_n}| \geq \epsilon)\\ &\leq \sum_{n=1}^{N_{0}}P(|\frac{X_n}{l_n}| \geq \epsilon)+\sum_{n\geq N_{0}+1}\frac{1}{n^{2}}<\infty \end{align}

And this is true for all $\epsilon>0$.

Now you set $B_{\epsilon}=\{|\frac{X_{n}}{l_{n}}|\geq\epsilon\,\text{infinitely often}\}=\limsup_{n \to \infty} \{|\frac{X_n}{l_n}| \geq \epsilon\}$, where the $\limsup$ is taken over events.

Now you take $B=\bigcup_{m\geq 1}B_{\frac{1}{m}}$; by what was shown above, $B$ is a countable union of null sets, so $P(B)=0$, or equivalently $P(B^{c})=1$.

Now show that for any $\omega\notin B$, or equivalently for any $\omega\in B^{c}$, which is a set of probability $1$, one has $\frac{X_{k}(\omega)}{l_{k}}\xrightarrow{k\to\infty}0$.
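For completeness, this last step can be spelled out as follows. Fix $\omega \in B^{c}$ and $\epsilon > 0$, and choose $m \in \mathbb N$ with $\frac{1}{m} < \epsilon$. Since $\omega \notin B_{\frac{1}{m}}$, the inequality $|\frac{X_{k}(\omega)}{l_{k}}| \geq \frac{1}{m}$ holds for only finitely many $k$, so there is some $K$ (depending on $\omega$) with $$ \left|\frac{X_{k}(\omega)}{l_{k}}\right| < \frac{1}{m} < \epsilon \quad \text{for all } k > K. $$ As $\epsilon > 0$ was arbitrary, $\frac{X_{k}(\omega)}{l_{k}} \to 0$.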

Note that the last step above is basically a proof of the following useful result: if $X_{n},X$ are random variables such that $\displaystyle\sum_{n\in\Bbb{N}}P(|X_{n}-X|>\epsilon)<\infty$ for each $\epsilon>0$, then $X_{n}\xrightarrow{a.s.}X$.
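This result is easy to illustrate numerically (a sketch under assumed data, not a proof): take $X = 0$ and let $X_n$ be $1$ with probability $1/n^2$ and $0$ otherwise, so the tail probabilities are summable. Borel–Cantelli then says that on almost every sample path only finitely many $X_n$ are nonzero.

```python
import random

# Illustration of: sum_n P(|X_n - X| > eps) < infinity for every eps > 0
# implies X_n -> X almost surely.  Here X = 0 and X_n = 1 with probability
# 1/n**2, else 0 (an illustrative choice of distribution).

random.seed(1)

N = 500        # truncation of each sample path
PATHS = 1000   # number of independent paths

def hits_after(cutoff):
    """Simulate one path and report whether X_n = 1 for some n > cutoff."""
    return any(random.random() < 1.0 / n ** 2 for n in range(cutoff + 1, N + 1))

# The fraction of paths with a nonzero X_n beyond n = 50 has expectation at
# most sum_{n > 50} 1/n**2, which is about 0.018, so it should be small.
frac = sum(hits_after(50) for _ in range(PATHS)) / PATHS
print(frac)
```

Pushing the cutoff further out drives `frac` toward $0$, which is exactly the almost-sure convergence $X_n \to 0$ seen in simulation.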