Convergence of the maximum of uniform random variables to the distribution parameter


Can someone please help me prove the convergence (in probability) of the maximum of uniform random variables to the distribution parameter?

Let $X_{1}, X_{2}, \dots, X_{n}$ be i.i.d. uniform R.V.s on $[0, \theta]$, for some $\theta > 0$. Further, let $M_{n} = \max_{i=1,\dots,n} X_{i}$. I want to prove that $M_{n} \to \theta$ in probability as $n \to \infty$.

Intuitively, this makes sense: as we sample more and more points from the distribution, it becomes increasingly likely that some point lands close to the right end of the support. More mathematically:

$P(M_{n} \geq \theta ) = 0$, and

$\forall t \in [0,\theta], P(M_{n} \leq \theta -t) = \left(1-\frac{t}{\theta} \right)^n$.

The first equation says that the maximum almost surely does not exceed $\theta$, and the second quantifies the distance of the maximum from $\theta$: for any fixed $t > 0$, the probability that $M_n$ stays below $\theta - t$ decays geometrically in $n$, so the distribution of $M_n$ is squeezed toward the true parameter value $\theta$.
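This geometric decay is easy to check numerically. Below is a minimal sketch (the values $\theta = 2$ and $t = 0.1$ are arbitrary choices for illustration, not from the problem statement) comparing the exact probability $\left(1 - \frac{t}{\theta}\right)^n$ with a Monte Carlo estimate:

```python
import random

def tail_prob(n, theta, t):
    """Exact P(M_n <= theta - t) = (1 - t/theta)^n for Uniform[0, theta] samples."""
    return (1 - t / theta) ** n

def mc_tail_prob(n, theta, t, trials=10_000, seed=0):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    hits = sum(
        max(rng.uniform(0, theta) for _ in range(n)) <= theta - t
        for _ in range(trials)
    )
    return hits / trials

theta, t = 2.0, 0.1  # illustrative values
for n in (10, 100, 1000):
    print(n, tail_prob(n, theta, t), mc_tail_prob(n, theta, t))
```

For $n = 1000$ the exact probability is $0.95^{1000} \approx 5 \times 10^{-23}$, numerically indistinguishable from zero, which is exactly the squeeze described above.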

I guess we could use the Law of Large Numbers to prove this, but I am not sure how to apply it here.

Edit: I just realized, in other words, I am looking to prove that $M_{n}$ is a consistent estimator of $\theta$.

On BEST ANSWER

Observe that the sequence $M_n = \max(X_1, \dots, X_n)$ is bounded in $[0, \theta]$ and monotone (nondecreasing). In fact,

$M_{n+1} = \max(M_n, X_{n+1}) \geq M_n.$

So, since a bounded monotone sequence converges, $M_n \to M$ almost surely for some limit $M \leq \theta$. Moreover, $P(M_n \leq \theta - t) = \left(1 - \frac{t}{\theta}\right)^n \to 0$ for every $t \in (0, \theta]$, so the limit cannot be smaller than $\theta$; hence $M = \theta$. Therefore $M_n \rightarrow \theta$

  • almost surely
  • in probability
  • in distribution
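The two ingredients of this argument, monotonicity and convergence to $\theta$, can be seen on a single simulated sample path of the running maximum. A minimal sketch, with an arbitrary illustrative choice $\theta = 1$:

```python
import random

theta = 1.0  # illustrative parameter value
rng = random.Random(42)

# One sample path of the running maximum, built via M_{n+1} = max(M_n, X_{n+1}).
path = []
m = 0.0
for _ in range(10_000):
    m = max(m, rng.uniform(0, theta))
    path.append(m)

# The path is nondecreasing and bounded above by theta ...
assert all(a <= b for a, b in zip(path, path[1:]))
assert path[-1] <= theta
# ... and the final value ends up very close to theta.
print(f"M_10000 = {path[-1]:.6f}  (theta = {theta})")
```

With $n = 10{,}000$ draws, $P(M_n \leq \theta - 0.01) = 0.99^{10000} \approx e^{-100}$, so the final running maximum is essentially guaranteed to lie within $0.01$ of $\theta$.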