In a class we looked at this example:
Let $X_1,\dotsc,X_n\sim U(0,\theta)$ be i.i.d. Then the likelihood function is
$\mathcal{L}(\theta) = \begin{cases} \dfrac{1}{\theta^{n}} & \text{if } \text{max}\{X_1,\dotsc,X_n\}\leq\theta \\ 0 & \text{otherwise} \end{cases}$
It was only mentioned that the maximum likelihood estimator $\hat{\theta}=\max\{X_1,\dotsc,X_n\}$ is asymptotically unbiased and consistent, but I'm curious why this is true. Could anyone help me see how to prove it?
Thanks.
We want to obtain the distribution of the estimator $\hat{\theta} = \max_{1 \leq i \leq n} X_i$. Since the $X_i$ are independent and identically distributed, for $0 \leq x \leq \theta$:
$$P(\hat{\theta} \leq x) = \prod_{i=1}^n P(X_i \leq x) = P(X_1 \leq x)^n = \left(\frac{x - 0}{\theta - 0}\right)^n = \frac{x^n}{\theta^n}$$
Differentiating this CDF with respect to $x$ gives the density of $\hat{\theta}$:
$$f(x) = \frac{n x^{n-1}}{\theta^n} I_{[0, \theta]}(x)$$
Hence:
$$E[\hat{\theta}] = \int_{-\infty}^{+\infty} x f(x) dx = \int_{-\infty}^{+\infty} x \frac{n x^{n-1}}{\theta^n} I_{[0, \theta]}(x) dx = \int_0^\theta \frac{n}{\theta^n} x^n dx = \frac{n}{\theta^n} \frac{\theta^{n+1}}{n+1} = \frac{n}{n+1} \theta$$
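As a quick sanity check (not part of the proof), here is a small Monte Carlo simulation of this mean; the values of $\theta$, $n$ and the number of trials are arbitrary illustrative choices:

```python
import random

# Monte Carlo check of E[theta_hat] = n/(n+1) * theta for theta_hat = max(X_i).
# theta, n and the trial count are arbitrary illustrative choices.
random.seed(42)
theta = 5.0
n = 10
trials = 200_000

total = 0.0
for _ in range(trials):
    # One sample of size n from U(0, theta); theta_hat is its maximum.
    total += max(random.uniform(0, theta) for _ in range(n))

empirical_mean = total / trials
theoretical_mean = n / (n + 1) * theta  # = 50/11, slightly below theta = 5
print(empirical_mean, theoretical_mean)
```

The empirical average lands very close to $\frac{n}{n+1}\theta$, visibly below $\theta$, which is the bias the formula predicts.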
Now, let's see what happens with the variance:
$$V[\hat{\theta}] = E[\hat{\theta}^2] - E[\hat{\theta}]^2$$
And
$$E[\hat{\theta}^2] = \int_{-\infty}^{+\infty} x^2 f(x) dx = \int_{-\infty}^{+\infty} x^2 \frac{n x^{n-1}}{\theta^n} I_{[0, \theta]}(x) dx = \int_0^\theta \frac{n}{\theta^n} x^{n+1} dx = \frac{n}{\theta^n} \frac{\theta^{n+2}}{n+2} = \frac{n}{n+2} \theta^2$$
Thus
$$V[\hat{\theta}] = \frac{n}{n+2} \theta^2 - \left(\frac{n}{n+1}\right)^2 \theta^2 = \frac{n}{(n+2)(n+1)^2} \theta^2$$
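The variance formula can be checked by simulation in the same way (again, $\theta$, $n$ and the trial count are arbitrary choices for illustration):

```python
import random

# Monte Carlo check of V[theta_hat] = n / ((n+2)(n+1)^2) * theta^2.
# theta, n and the trial count are arbitrary illustrative choices.
random.seed(1)
theta = 5.0
n = 10
trials = 200_000

# Draw many sample maxima and compare their empirical variance to the formula.
maxima = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(trials)]
mean = sum(maxima) / trials
empirical_var = sum((m - mean) ** 2 for m in maxima) / trials
theoretical_var = n / ((n + 2) * (n + 1) ** 2) * theta ** 2
print(empirical_var, theoretical_var)
```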
So $E[\hat{\theta}] = \frac{n}{n+1}\theta \to \theta$ as $n \to +\infty$, meaning the estimator is asymptotically unbiased. Moreover, since both the bias and the variance tend to $0$, the mean squared error $E[(\hat{\theta}-\theta)^2] = V[\hat{\theta}] + \left(E[\hat{\theta}]-\theta\right)^2 \to 0$, and by Chebyshev's inequality $\hat{\theta} \to \theta$ in probability. Hence $\hat{\theta}$ is consistent.
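Consistency is also easy to see empirically: a single sample maximum gets closer and closer to $\theta$ as $n$ grows (here $\theta = 5$ and the sample sizes are arbitrary illustrative choices):

```python
import random

# Consistency in action: as n grows, theta_hat = max(X_i) concentrates at theta.
# theta and the sample sizes are arbitrary illustrative choices.
random.seed(0)
theta = 5.0
for n in (10, 100, 1_000, 10_000):
    theta_hat = max(random.uniform(0, theta) for _ in range(n))
    # The gap theta - theta_hat is about theta/(n+1) on average.
    print(n, theta - theta_hat)
```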