Compute $\limsup\limits_{n\to\infty}\frac{X_n}{\log(n)}$ and $\liminf\limits_{n\to\infty}\frac{X_n}{\log(n)}$


Let $X_1,X_2,\dots$ be independent random variables with $X_n \sim \operatorname{Exp}(1)$. I have to compute $\limsup\limits_{n\to\infty}\frac{X_n}{\log(n)}$ and $\liminf\limits_{n\to\infty}\frac{X_n}{\log(n)}$.

I saw this question here. While I understand the reasoning, I don't get how you can "see" that $\limsup\limits_{n\to\infty}\frac{X_n}{\log(n)}$ equals $1$. And thus, I have no idea what $\liminf\limits_{n\to\infty}\frac{X_n}{\log(n)}$ will be equal to. I feel like I have "nothing to work with": each $X_n:\Omega\to \mathbb{R}_{\geq 0}$ is just some measurable function.
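Before any proof, it can help to "see" the answer by simulation. The sketch below (plain Python, standard library only; the cutoff $n\ge 100$, the seed, and the sample size are arbitrary choices, not part of the problem) draws independent $\operatorname{Exp}(1)$ samples and inspects the ratios $X_n/\log(n)$.

```python
import math
import random

# Monte Carlo illustration (not a proof): draw X_n ~ Exp(1) independently
# and look at the ratios X_n / log(n) for n >= 100.  The claim is that the
# ratio rarely strays far above 1, yet dips toward 0 infinitely often.
random.seed(0)

N = 200_000
ratios = [random.expovariate(1.0) / math.log(n) for n in range(100, N)]

print("max of X_n/log(n):", max(ratios))  # theory: limsup is 1 a.s.
print("min of X_n/log(n):", min(ratios))  # theory: liminf is 0 a.s.
```

The maximum hovers near 1 while the minimum sits near 0, matching the limits computed below.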



Best answer:

1. $\varlimsup\limits_{n\to\infty}\frac{X_n}{\log(n)}=1$: On the one hand,
\begin{gather*}
\mathsf{P}\Big(\frac{X_n}{\log(n)}>1+\frac{1}{k}\Big)=e^{-(1+1/k)\log(n)} =\frac{1}{n^{1+1/k}},\quad \forall n\ge 2,\ k \in \mathbb{N}_+\\
\Downarrow\\
\sum_{n=2}^{\infty} \mathsf{P}\Big(\frac{X_n}{\log(n)}>1+\frac{1}{k}\Big)<\infty \\
\quad \hphantom{\text{using the first Borel-Cantelli lemma}}\Downarrow \quad \text{using the first Borel-Cantelli lemma}\\
\mathsf{P}\Big(\Big\{\frac{X_n}{\log(n)}>1+\frac{1}{k}\Big\}\text{ i.o. in $n$} \Big)=0\\
\Downarrow\\
\mathsf{P}\Big(\varlimsup\limits_{n\to\infty}\frac{X_n}{\log(n)} \le 1+\frac{1}{k}\Big) =1, \quad \forall k \in \mathbb{N}_+\\
\Downarrow \\
\mathsf{P}\Big(\varlimsup\limits_{n\to\infty}\frac{X_n}{\log(n)} \le 1\Big) =1. \tag{1}
\end{gather*}
On the other hand,
\begin{gather*}
\mathsf{P}\Big(\frac{X_n}{\log(n)}>1-\frac{1}{k}\Big) =\frac{1}{n^{1-1/k}},\quad \forall n\ge 2,\ k \in \mathbb{N}_+ \\
\Downarrow \\
\sum_{n=2}^{\infty} \mathsf{P}\Big(\frac{X_n}{\log(n)}>1-\frac{1}{k}\Big)=+\infty \\
\quad \hphantom{\text{using the second Borel-Cantelli lemma}}\Downarrow \quad \text{using the second Borel-Cantelli lemma}\\
\mathsf{P}\Big(\Big\{\frac{X_n}{\log(n)}>1-\frac{1}{k}\Big\}\text{ i.o. in $n$} \Big)=1\\
\Downarrow\\
\mathsf{P}\Big(\varlimsup\limits_{n\to\infty}\frac{X_n}{\log(n)} \ge 1-\frac{1}{k}\Big) =1, \quad \forall k \in \mathbb{N}_+\\
\Downarrow \\
\mathsf{P}\Big(\varlimsup\limits_{n\to\infty}\frac{X_n}{\log(n)} \ge 1\Big) =1. \tag{2}
\end{gather*}
Therefore, combining (1) and (2),
\begin{equation*}
\varlimsup\limits_{n\to\infty}\frac{X_n}{\log(n)}=1\quad\text{a.s.}
\end{equation*}
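The convergence/divergence dichotomy driving the two Borel-Cantelli applications can be sanity-checked numerically. The sketch below (illustrative only; the choice $k=2$ and the truncation points are arbitrary) computes partial sums of $\sum n^{-(1+1/k)}$ and $\sum n^{-(1-1/k)}$: the first stabilizes, the second keeps growing roughly like $2\sqrt{N}$.

```python
# Numerical sanity check of the two series in the proof, with k = 2:
# sum n^{-(1+1/k)} converges, while sum n^{-(1-1/k)} diverges.
k = 2

def partial_sum(exponent, N):
    """Partial sum of sum_{n=1}^{N} n^(-exponent)."""
    return sum(n ** -exponent for n in range(1, N + 1))

convergent = [partial_sum(1 + 1 / k, N) for N in (10**3, 10**4, 10**5)]
divergent = [partial_sum(1 - 1 / k, N) for N in (10**3, 10**4, 10**5)]

print("exponent 1 + 1/k:", convergent)  # approaches zeta(3/2) = 2.612... from below
print("exponent 1 - 1/k:", divergent)   # grows roughly like 2*sqrt(N), unbounded
```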

2. $\varliminf\limits_{n\to\infty}\frac{X_n}{\log(n)}=0$: On the one hand, since $X_n\ge 0$,
\begin{equation*}
\mathsf{P}\Big(\varliminf\limits_{n\to\infty}\frac{X_n}{\log(n)} \ge 0\Big) =1. \tag{3}
\end{equation*}
On the other hand, since $e^{-x}\ge n^{-1/k}$ for $x\in[0,\log(n)/k]$,
\begin{gather*}
\mathsf{P}\Big(X_n<\frac{\log(n)}{k}\Big)= \int_{0}^{\log(n)/k} e^{-x}\,\mathrm{d}x \ge \frac{\log(n)}{k n^{1/k}}\ge \frac{\log(n)}{kn}, \quad \forall n\ge 2,\ k \in \mathbb{N}_+\\
\Downarrow\\
\sum_{n=2}^{\infty}\mathsf{P}\Big(\frac{X_n}{\log(n)}<\frac{1}{k}\Big)=+\infty \\
\quad \hphantom{\text{using the second Borel-Cantelli lemma}}\Downarrow \quad \text{using the second Borel-Cantelli lemma}\\
\mathsf{P}\Big(\Big\{\frac{X_n}{\log(n)}<\frac{1}{k}\Big\}\text{ i.o. in $n$} \Big)=1 \\
\Downarrow \\
\mathsf{P}\Big(\varliminf\limits_{n\to\infty}\frac{X_n}{\log(n)} \le \frac{1}{k} \Big) =1, \quad \forall k \in \mathbb{N}_+\\
\Downarrow \\
\mathsf{P}\Big(\varliminf\limits_{n\to\infty}\frac{X_n}{\log(n)} \le 0\Big) =1. \tag{4}
\end{gather*}
Therefore, combining (3) and (4),
\begin{equation*}
\varliminf\limits_{n\to\infty}\frac{X_n}{\log(n)}=0\quad\text{a.s.}
\end{equation*}
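The integral bound used above, $\int_{0}^{\log(n)/k} e^{-x}\,\mathrm{d}x \ge \log(n)/(k n^{1/k}) \ge \log(n)/(kn)$, holds because the integrand is at least $e^{-\log(n)/k}=n^{-1/k}$ on the integration range, and $n^{1/k}\le n$ for $k\ge 1$. A quick numeric spot-check over an (arbitrary) grid of $n$ and $k$:

```python
import math

# Spot-check the two lower bounds used in the liminf proof:
#   1 - exp(-a) >= a * exp(-a)           with a = log(n)/k
#   1 - exp(-a) >= log(n) / (k * n)
# (1 - exp(-a) is the closed form of int_0^a e^{-x} dx.)
for k in (1, 2, 5, 10):
    for n in (2, 10, 1000, 10**6):
        a = math.log(n) / k
        assert 1 - math.exp(-a) >= a * math.exp(-a)
        assert 1 - math.exp(-a) >= math.log(n) / (k * n)

print("both bounds hold on the grid")
```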

Another answer:

$X_n$ isn't just an arbitrary measurable function: the $X_i$ are i.i.d. and exponentially distributed. In particular, $$ P(X_{n}/\log n > a) = \exp(-a\log n) = n^{-a} \quad (a >0). $$
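This tail identity is easy to confirm empirically. A minimal Monte Carlo check (seeded for reproducibility; the values of $n$, $a$, and the trial count are arbitrary illustration choices):

```python
import math
import random

# Empirical check of P(X > a*log(n)) = n^{-a} for X ~ Exp(1),
# here with n = 10 and a = 1, so the exact value is 0.1.
random.seed(1)

n, a, trials = 10, 1.0, 100_000
threshold = a * math.log(n)
hits = sum(random.expovariate(1.0) > threshold for _ in range(trials))
estimate = hits / trials

print("empirical:", estimate, " exact:", n ** -a)  # both close to 0.1
```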

The value $a = 1$ is the critical threshold: for $a > 1$, $\sum_{n=1}^\infty P(X_{n}/\log n > a) <\infty$, while for $a\leq 1$, $\sum_{n=1}^\infty P(X_{n}/\log n > a) =\infty$. This dichotomy is exactly what motivates using the Borel-Cantelli lemmas.

By this observation and the Borel-Cantelli lemmas (the second one applies because the $X_n$ are independent), for each $\varepsilon >0$: with probability one, ${X_n}/\log{n} > 1-\varepsilon$ infinitely often, and with probability one, eventually ${X_n}/\log{n} < 1+\varepsilon$. So $$ \limsup_{n\to\infty} \frac{X_{n}}{\log n} = 1 \quad \text{a.s.} $$