Exercise 2.5.8 in Durrett's book


Let $X_1,X_2,\dots$ be i.i.d. with $E\log^{+}|X_1|<\infty$, where $\log^{+}x=\max(\log x,0)$. Show that $\limsup_{n\to\infty} \frac{1}{n}\log |X_n|$ is zero a.s.

I am reading a solution to this problem, which says:

Let $\epsilon>0$ and let $n_0$ be large enough that $n_0\epsilon>1$. Then

$\sum_{n\geq n_0}P\left(\frac{1}{n}\log|X_n|>\epsilon\right)=\sum_{n\geq n_0}P\left(\frac{1}{\epsilon}\log^{+}|X_1|>n\right)<\infty$.
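Spelling out the middle equality as I understand it (it uses only that the $X_n$ are identically distributed and that $n\epsilon>0$):

```latex
P\left(\tfrac{1}{n}\log|X_n|>\epsilon\right)
= P\left(\log|X_n|>n\epsilon\right)
= P\left(\log^{+}|X_n|>n\epsilon\right)
= P\left(\tfrac{1}{\epsilon}\log^{+}|X_1|>n\right),
```

since $\log|x|>n\epsilon>0$ forces $\log|x|=\log^{+}|x|$, and $X_n$ has the same distribution as $X_1$.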

Why does $n_0$ need to be large? I think the displayed identity already holds for every $n_0\geq 1$.
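As a numerical sanity check (not part of the proof; the standard Cauchy law is just one convenient choice satisfying $E\log^{+}|X_1|<\infty$), the supremum of $\frac{1}{n}\log|X_n|$ over large $n$ should already be close to $0$:

```python
import math
import random

# Sanity check (not a proof): if E[log^+|X_1|] < infinity, then
# (1/n) log|X_n| should have limsup 0 almost surely.  We use standard
# Cauchy samples, whose heavy tails still give E[log^+|X_1|] < infinity.
random.seed(0)
N = 100_000

# Inverse-CDF sampling: tan(pi*(U - 1/2)) is standard Cauchy for U ~ U(0,1).
xs = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]
vals = [math.log(abs(x)) / n for n, x in enumerate(xs, start=1)]

# Supremum of (1/n) log|X_n| over the tail n >= 1000: should be tiny.
tail_sup = max(vals[999:])
print(tail_sup)
```

With this seed the tail supremum comes out as a small positive number, consistent with the limsup being $0$.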



It is there to make sure the sum is finite: if $n_0\epsilon>1$, you can bound the sum above by the expectation $\frac{1}{\epsilon}\mathbb{E}\log^{+}|X_1|$, plus $1$ to correct for rounding. Recall that a random variable $X$ taking values in $\{0,1,2,\dots\}$ satisfies $\mathbb{E}X=\sum_{i=1}^{\infty}P(X\geq i)$. Upper bound $\frac{1}{\epsilon}\log^{+}|X_1|$ by its ceiling, which is such a nonnegative integer-valued random variable, and compare the two sums term by term.
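Spelling this comparison out (the ceiling bound is my reading of the hint above):

```latex
\begin{aligned}
\sum_{n\geq 1} P\left(\tfrac{1}{\epsilon}\log^{+}|X_1|>n\right)
  &\leq \sum_{n\geq 1} P\left(\Big\lceil \tfrac{1}{\epsilon}\log^{+}|X_1|\Big\rceil \geq n\right)\\
  &= \mathbb{E}\Big\lceil \tfrac{1}{\epsilon}\log^{+}|X_1|\Big\rceil
   \leq \tfrac{1}{\epsilon}\,\mathbb{E}\log^{+}|X_1|+1<\infty.
\end{aligned}
```

Since the full sum from $n=1$ is already finite, so is every tail sum; taking $n_0$ large is a convenience rather than a necessity.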