Geometric distribution and Borel–Cantelli lemma


Let $X_1, X_2, \ldots$ be i.i.d. random variables with geometric distribution with parameter $p = 1 - e^{-1}$, i.e. $P(X_1 = k) = p(1 - p)^k$ for $k = 0, 1, 2, \ldots$. Prove that $$P\left[\limsup_{n \to \infty}\frac{X_n}{\log(n)}=1\right]=1.$$

I'm stuck on this exercise. I think I have to use the Borel–Cantelli lemma. Since the random variables are independent, I need to prove that

$$\sum_{n=1}^{\infty}P\left[\frac{X_n}{\log(n)}=1\right]=\infty,$$ then \begin{align*} \sum_{n=1}^{\infty}P\left[\frac{X_n}{\log(n)}=1 \right]&=\sum_{n=1}^{\infty}P\left[X_n=\log(n)\right]\\ &=\sum_{n=1}^{\infty} p(1 -p)^{\log(n)}\\ &=p\sum_{n=1}^{\infty} (1 -1+e^{-1})^{\log(n)}\\ &=p\sum_{n=1}^{\infty} (e^{-1})^{\log(n)}\\ &=p\sum_{n=1}^{\infty} \left(\frac{1}{e}\right)^{\log(n)}\\ \end{align*}

I know $\log(n)$ is not always an integer, so my procedure is definitely wrong, but I do not know how to begin. Any help?


Accepted answer

In order to show $P(\limsup_n Y_n=1)=1$ for a sequence of random variables $Y_1,Y_2,\dots,$ you always need to show two things:

  1. For all $\newcommand\e{\varepsilon}\e>0$, $P(Y_n>1+\e\;\text{ i.o.})=0$.

  2. For all $\e>0$, $P(Y_n>1-\e\;\text{ i.o.})=1$.

This is because for a deterministic sequence $a_1,a_2,\dots$, we have $\limsup_n a_n=L$ if and only if, for every $\e>0$, infinitely many $a_n$ are larger than $L-\e$ while only finitely many $a_n$ are larger than $L+\e$. These are standard undergraduate analysis facts, so I omit the proofs.
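These analysis facts can be illustrated numerically (my own toy example, not part of the answer): take $a_n = 1 + (-1)^n/\log(n+2)$, so that $\limsup_n a_n = 1$. For a fixed $\varepsilon$, only finitely many terms exceed $1+\varepsilon$, while a positive fraction of terms exceed $1-\varepsilon$:

```python
import math

# Toy sequence with limsup = 1: a_n = 1 + (-1)^n / log(n + 2).
a = [1 + (-1) ** n / math.log(n + 2) for n in range(1, 10**6)]

eps = 0.1
# Condition (1): only finitely many terms exceed 1 + eps
# (here: only even n with log(n + 2) < 1/eps, i.e. n + 2 < e^10).
above = [n for n, x in enumerate(a, start=1) if x > 1 + eps]
# Condition (2): infinitely many terms exceed 1 - eps
# (here: a positive fraction of the sequence, growing with its length).
below = sum(1 for x in a if x > 1 - eps)

print(len(above), max(above))  # finitely many, all at small n
print(below)
```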

The Borel–Cantelli lemmas suggest we should look at the convergence of $\sum_n P(X_n/\log n> 1\pm \e)$ to determine points $(1)$ and $(2)$. Using the fact that $P(X_1\ge n)=(1-p)^n$ for any integer $n$ (which I will apply at non-integer values, without worrying about the details), we have $$ \sum_{n=2}^\infty P\left(\frac{X_n}{\log n}>1\pm \e\right)=\sum_{n=2}^\infty P(X_n>(1\pm \e)\log n)=\sum_{n=2}^\infty (1-p)^{(1\pm \e)\log n}=\sum_{n=2}^\infty n^{(1\pm\e)\log(1-p)} $$ In the last step, I used the property that $a^{\log b}=b^{\log a}$. After substituting $p=1-e^{-1}$, so that $\log(1-p)=-1$, the general term becomes $n^{-(1\pm\e)}$: the series converges for $+\e$ and diverges for $-\e$, which allows you to conclude $(1)$ and $(2)$.
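Not part of the original answer, but a quick numerical sketch of this last series (with $\varepsilon = 0.5$ as an arbitrary choice): since $\log(1-p) = -1$, the general term is $n^{-(1\pm\varepsilon)}$, so the partial sums should stabilise for $+\varepsilon$ and keep growing for $-\varepsilon$:

```python
import math

p = 1 - math.exp(-1)   # so log(1 - p) = -1
eps = 0.5              # arbitrary choice of epsilon

def partial_sum(sign, N):
    """Partial sum of sum_{n=2}^N n^((1 + sign*eps) * log(1 - p))."""
    expo = (1 + sign * eps) * math.log(1 - p)  # equals -(1 + sign*eps)
    return sum(n ** expo for n in range(2, N + 1))

for N in (10**3, 10**4, 10**5):
    # +eps: stabilises (convergent); -eps: keeps growing (divergent)
    print(N, partial_sum(+1, N), partial_sum(-1, N))
```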

Another answer

Let $\varepsilon > 0.$ Consider the event $\mathrm{E}_{n, \varepsilon} = \{X_n > (1 + \varepsilon) \log n\}.$ We know that for integers $t,$ $\mathbf{P}(X_1 \geq t) = (1 - p)^t.$ I will let you check that for all $t > 1,$ $\mathbf{P}(X_1 \geq t) \asymp (1 - p)^t,$ where $x \asymp y$ means $ax \leq y \leq bx$ for two constants $a, b > 0$ that are "universal" (here they do not depend on $t,$ though they may depend on $p$).
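To make the $\asymp$ claim concrete (my own verification, not spelled out in the answer): for real $t > 0$ we have $\mathbf{P}(X_1 \geq t) = (1-p)^{\lceil t \rceil}$, and since $t \leq \lceil t \rceil < t + 1$ this is squeezed between $(1-p)\cdot(1-p)^t$ and $(1-p)^t$, so the constants can be taken as $a = 1 - p$ and $b = 1$:

```python
import math

p = 1 - math.exp(-1)

def tail(t):
    """P(X_1 >= t) for X_1 geometric on {0, 1, 2, ...}: equals (1-p)^ceil(t)."""
    return (1 - p) ** math.ceil(t)

# Check a * (1-p)^t <= P(X_1 >= t) <= b * (1-p)^t with a = 1-p, b = 1.
for t in [1.3, 2.0, 5.7, 10.01]:
    ref = (1 - p) ** t
    assert (1 - p) * ref <= tail(t) <= ref
print("bounds hold")
```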

Let us assume, for the moment, that we have established that $\mathbf{P}(\mathrm{E}_{n, \varepsilon})$ defines a summable series (in $n$) for every $\varepsilon > 0.$ Then $\left\{\mathrm{E}_{n, \varepsilon}\ \mathrm{i.o.}\right\}$ has probability zero for every $\varepsilon$, and therefore $\mathrm{U} = \bigcup\limits_{m = 1}^\infty \left\{\mathrm{E}_{n, m^{-1}}\ \mathrm{i.o.}\right\}$ also has probability zero. The event $\mathrm{U}$ is characterised by $\mathrm{U}^\complement = \left\{ \limsup\limits_{n \to \infty} \dfrac{X_n}{\log n} \leq 1 \right\}.$ A symmetric argument using $\mathrm{E}_{n, -\varepsilon}$ (together with the second Borel–Cantelli lemma, which requires the independence of the $X_n$) shows that it only remains to prove that $\mathbf{P}(\mathrm{E}_{n, -\varepsilon})$ defines a divergent series for every $\varepsilon > 0.$

Now, it easily follows that $\mathbf{P}(\mathrm{E}_{n, \pm \varepsilon}) \asymp (1 - p)^{(1 \pm \varepsilon) \log n}.$ By virtue of the Cauchy Condensation Test, $$ \sum\limits_n \mathbf{P}(\mathrm{E}_{n, \pm \varepsilon}) \asymp \sum_k 2^k \mathbf{P}(\mathrm{E}_{2^k, \pm \varepsilon}) \asymp \sum_k 2^k (1 - p)^{\big((1 \pm \varepsilon) \log 2 \big) k} = \sum_k 2^k 2^{-(1 \pm \varepsilon) k}, $$ where the last equality follows from $1 - p = e^{-1}.$ Simplifying the summands yields $2^k 2^{-(1 \pm \varepsilon) k} = 2^{\mp \varepsilon k},$ so the series converges for $+\varepsilon$ and diverges for $-\varepsilon.$ Q.E.D.
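As a sanity check (my own addition, not part of either answer), one can simulate the sequence and watch $\max_n X_n/\log n$ over a late window of indices settle near $1$. The snippet uses NumPy; note that NumPy's geometric sampler has support $\{1, 2, \ldots\}$, so we subtract $1$ to match $P(X = k) = p(1-p)^k$ on $\{0, 1, 2, \ldots\}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 1 - np.exp(-1)
N = 200_000

# NumPy's geometric counts trials until the first success (support {1, 2, ...});
# subtracting 1 gives P(X = k) = p (1-p)^k on {0, 1, 2, ...}.
X = rng.geometric(p, size=N) - 1

# Over a late window of indices, max_n X_n / log n should be close to 1
# (early indices are excluded: there log n is small and the ratio is noisy).
n = np.arange(1, N + 1)
window = slice(N // 2, N)
ratio_max = (X[window] / np.log(n[window])).max()
print(f"max of X_n/log(n) over n in [{N // 2}, {N}]: {ratio_max:.3f}")
```

The convergence toward $1$ is slow (the fluctuations are of order $1/\log N$), so the value only hovers near $1$ rather than matching it exactly.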