Law of large numbers problem and the expected value of a maximum


We've got to prove that $\frac{M_{n}}{\ln(n)}\rightarrow 1$ a.s., where $M_{n}=\max\left\{{X_{1},...,X_{n}}\right\}$ and the $X_{i}\sim \mathrm{exp}(1)$ are i.i.d. So it is obvious that we will have to use the Law of Large Numbers.

First I thought to rewrite it as $\frac{M_{n}}{n}\cdot\frac{n}{\ln(n)}$, apply the LLN to the first factor, and take the limit of the second, but $\frac{n}{\ln(n)}$ diverges.

My second thought was to calculate the CDF of $M_{n}$ and apply the LLN to $S_{n}=\sum_{k=1}^{n}M_{k}$, to see whether I obtain something like

$\frac{S_{n}}{n}\sim \ln(n) \quad (n\rightarrow\infty).$

But I don't know how to compute $\mathbb{E}(M_{n})$ (the CDF of $M_{n}$ is $(1-e^{-x})^{n}$ for $x\geq 0$).
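For a numerical sanity check, note that $\mathbb{E}(M_{n})=\int_0^\infty \mathbb{P}(M_{n}>x)\,dx=\int_0^\infty \left(1-(1-e^{-x})^{n}\right)dx$, which follows from the CDF above. A minimal pure-Python sketch (the cutoff `upper` and step count are arbitrary choices, not part of the problem):

```python
import math

def expected_max(n, upper=60.0, steps=200_000):
    """Approximate E[M_n] = integral of P(M_n > x) dx over [0, upper],
    where P(M_n > x) = 1 - (1 - e^{-x})^n, via the midpoint rule."""
    h = upper / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h  # midpoint of the i-th subinterval
        total += (1.0 - (1.0 - math.exp(-x)) ** n) * h
    return total

for n in (10, 100, 1000):
    print(n, expected_max(n), math.log(n))
```

Comparing the two printed columns suggests that $\mathbb{E}(M_{n})$ grows like $\ln(n)$ plus a bounded offset.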

Am I on the right track? Any help would be appreciated.


Accepted answer:

"it is obvious that we will have to use the Law of Large Numbers."

The law of large numbers usually concerns partial sums of random variables, while here we have a maximum.

Here are some hints to solve the problem.

  1. Show that the series $\sum_n\mathbb P\left(\left\lvert \frac{M_{2^n}}{\log 2^n} -1 \right\rvert\gt \varepsilon \right)$ is convergent for all positive $\varepsilon$. Using the Borel-Cantelli lemma, this will show that $ \frac{M_{2^n}}{\log 2^n} \to 1$ almost surely.

  2. Let $R_N:= (M_{2^{N+ 1}}-M_{2^N})/\log 2^N $. Show that $R_N\to 0$ almost surely by using the first part.

  3. Conclude: for $2^{N}\leq n< 2^{N+1}$ we have $M_{2^{N}}\leq M_{n}\leq M_{2^{N+1}}$ and $\log 2^{N}\leq \log n\leq \log 2^{N+1}$, which sandwiches $\frac{M_{n}}{\log n}$ between quantities converging to $1$.
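The convergence in the hints can be checked empirically. Since the CDF of $M_{n}$ is $(1-e^{-x})^{n}$, a sample of $M_{n}$ can be drawn in $O(1)$ time by inverse transform, without simulating $n$ exponentials. A hedged sketch (the seed and trial count are arbitrary choices):

```python
import math
import random

def sample_max(n, rng):
    """Draw M_n = max of n i.i.d. Exp(1) variables in O(1),
    by inverting its CDF F(x) = (1 - e^{-x})^n."""
    u = rng.random()
    return -math.log(1.0 - u ** (1.0 / n))

rng = random.Random(0)
for n in (2**10, 2**20, 2**30):
    trials = 2000
    avg = sum(sample_max(n, rng) / math.log(n) for _ in range(trials)) / trials
    print(f"n = 2^{round(math.log2(n))}: mean of M_n/log(n) ~ {avg:.3f}")
```

The printed ratios drift toward $1$ as $n$ grows along the dyadic subsequence, consistent with part 1 of the hints.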

Another answer:

Claim: $\mathbb{E}(M_n)=H_n$, the $n$-th harmonic number ($H_n=\sum_{i=1}^{n}\frac{1}{i}$).

Proof: Let $Y_1,...,Y_n$ be the order statistics of $(X_1,...,X_n)$, so that $Y_n=M_n$.

You can prove that $Y_1=\min_i\{X_i\}\sim \exp(n)$, so $\mathbb{E}(Y_1)=\frac{1}{n}$.

Now we use the memoryless property of the exponential distribution. Conditional on $Y_1$, it essentially "resets" the remaining random variables, so the gap between $Y_1$ and $Y_2$ has the same distribution as the minimum of $n-1$ i.i.d. $\exp(1)$ random variables; hence $\mathbb{E}(Y_2-Y_1)=\frac{1}{n-1}$.

Inductively, $\mathbb{E}(Y_{i+1}-Y_i)=\frac{1}{n-i}$, and therefore $\mathbb{E}(M_n)=\mathbb{E}(Y_n)=\mathbb{E}\left(Y_1+\sum_{i=1}^{n-1} (Y_{i+1}-Y_i)\right)=\sum_{i=0}^{n-1}\frac{1}{n-i}=H_n$.
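The claim $\mathbb{E}(M_n)=H_n$ is easy to verify by direct simulation; a minimal sketch (the choice $n=20$, the seed, and the number of trials are arbitrary):

```python
import math
import random

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / i for i in range(1, n + 1))

rng = random.Random(42)
n, trials = 20, 50_000
# Monte Carlo estimate of E[M_n]: average the max of n i.i.d. Exp(1) draws.
est = sum(max(rng.expovariate(1.0) for _ in range(n)) for _ in range(trials)) / trials
print(f"Monte Carlo E[M_{n}] ~ {est:.4f},  H_{n} = {harmonic(n):.4f}")
```

The two printed values agree to within Monte Carlo error, and since $H_n = \ln(n) + O(1)$, this is consistent with the $\ln(n)$ growth the question is after.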