Almost sure convergence of maximum of sequence of random variables


Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables drawn from a distribution $F$ with exponential tails, and denote $Y_n = \max (X_1, \dots , X_n)$. How can we prove that $$ \lim_{n \rightarrow \infty} \frac{Y_n}{\log n} = c $$ almost surely for some constant $c$?

Also, how can we determine the value of $c$? What if we know that the distribution $F$ is, say, a Gamma distribution (or another common distribution)?

This result seems standard, as indicated in the question here, but I could not find a proof.


Accepted answer

For a standard exponential, we know that $Z_n=Y_n-\log(n)$ converges in distribution to a nondegenerate distribution (a Gumbel), so this means $\frac{Z_n}{\log(n)}$ converges almost surely to zero, which in turn means that $\frac{Y_n}{\log(n)}$ converges almost surely to $1.$
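As a quick illustrative check (a simulation sketch, not part of the original answer), one can simulate i.i.d. standard exponentials and watch $Y_n/\log(n)$ settle near $1$:

```python
# Monte Carlo sketch: for i.i.d. standard exponential variables,
# the running maximum Y_n should satisfy Y_n / log(n) close to 1 for large n.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.exponential(scale=1.0, size=n)
y = np.maximum.accumulate(x)      # running maxima Y_1, ..., Y_n

ratio = y[-1] / np.log(n)
print(f"Y_n / log(n) at n = {n}: {ratio:.3f}")
```

Since the fluctuation $Y_n - \log(n)$ is Gumbel-sized (order $1$) and $\log n \approx 12$ here, the printed ratio is typically within roughly ten percent of $1$ at this sample size.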

So the almost-sure convergence of something like this follows from the extreme value distribution. In general, for a distribution with an infinite tail that decays faster than any power law, $\frac{Y_n-b_n}{a_n}$ converges in distribution to a Gumbel. For something like a Gamma, with a pure exponential tail $\sim e^{-x/\theta}$, one can work out that $a_n=\theta$ and that, to leading order in $n$, $b_n = \theta\log(n)$. So for a Gamma with PDF $\frac{1}{\Gamma(\alpha) \theta^\alpha}x^{\alpha-1}e^{-x/\theta},$ $\frac{Y_n}{\log(n)}$ converges almost surely to $\theta$.
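To see where $b_n \approx \theta\log(n)$ comes from (a sketch, not spelled out in the original answer): the Gamma tail satisfies $\bar F(x) = P(X > x) \sim \frac{x^{\alpha-1} e^{-x/\theta}}{\Gamma(\alpha)\,\theta^{\alpha-1}}$ as $x \to \infty$. Choosing $b_n$ so that $n \bar F(b_n) = 1$ and taking logarithms gives $$ \frac{b_n}{\theta} = \log n + (\alpha - 1)\log b_n - \log\!\big(\Gamma(\alpha)\,\theta^{\alpha-1}\big), $$ and iterating once, $$ b_n = \theta \log n + \theta(\alpha-1)\log\log n + O(1), $$ so $b_n/\log n \to \theta$, consistent with the claimed limit.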

Another answer

Regarding @spaceisdarkgreen's answer:

Why does the fact that $Z_n = Y_n - \log(n)$ converges in distribution to a nondegenerate distribution (a Gumbel) imply that $Z_n/\log(n)$ converges almost surely to $0$?

I can only see that $Z_n/\log(n)$ converges in probability to $0$, not almost surely.
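Indeed, convergence in distribution of $Z_n$ only yields $Z_n/\log(n) \to 0$ in probability. One standard way to obtain the almost-sure statement directly (a sketch for the standard exponential, not taken from either answer) is the first Borel–Cantelli lemma. Fix $\varepsilon > 0$. For the upper bound, $$ P\big(X_n > (1+\varepsilon)\log n\big) = n^{-(1+\varepsilon)}, $$ which is summable, so almost surely $X_n \le (1+\varepsilon)\log n$ for all large $n$, and hence $\limsup_n Y_n/\log n \le 1+\varepsilon$. For the lower bound, $$ P\big(Y_n < (1-\varepsilon)\log n\big) = \big(1 - n^{-(1-\varepsilon)}\big)^n \le e^{-n^{\varepsilon}}, $$ also summable, so almost surely $Y_n \ge (1-\varepsilon)\log n$ for all large $n$. Letting $\varepsilon \downarrow 0$ gives $Y_n/\log n \to 1$ almost surely; the same scheme, applied to a tail $\sim e^{-x/\theta}$, gives the limit $\theta$ in the Gamma case.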