How do we find the asymptotic variance for the maximum likelihood estimator from the Rao-Cramer lower bound?



As a concrete example, I have found that the Rao–Cramér lower bound for $$f(x;\theta)=\frac{1}{\theta}x^{(1-\theta)/\theta}\, , 0<x<\infty\, , 0<\theta<\infty$$ is $\dfrac{\theta^2}{n}$. How do I find the asymptotic variance without actually calculating the variance of the maximum likelihood estimator?

Here is the exact word of the question:

Find the Rao–Cramér lower bound, and thus the asymptotic variance of the maximum likelihood estimator $\hat\theta$, if the random sample $X_1, X_2, \cdots, X_n$ is taken from each of the distributions having the following pdf: $$f(x;\theta)=\frac{1}{\theta}x^{(1-\theta)/\theta}\, , 0<x<\infty\, , 0<\theta<\infty$$

There are 2 best solutions below

One of the features of an MLE $\hat{\theta}$ is that it is asymptotically efficient, i.e., its asymptotic variance attains the Cramér–Rao lower bound. Therefore, you can compute $I^{-1}(\theta)$ instead of calculating the variance of $\hat{\theta}$ directly. In our case $$ I(\theta)= - \mathbb{E}\left[\frac{\partial^2}{\partial\theta^2}l(x;\theta)\right], $$ where $l(x;\theta)$ is the log-likelihood. Namely, $$ L(x;\theta)= \theta^{-n} \Big(\prod_{i=1}^n x_i\Big)^{1/\theta-1}, \qquad l(x;\theta)=-n\ln\theta + (1/\theta - 1)\sum\ln x_i, $$ so the first two derivatives are $$ l' = -\frac{n}{\theta} - \frac{1}{\theta^2}\sum \ln x_i, \qquad l''=\frac{n}{\theta^2} + \frac{2}{\theta^3}\sum\ln x_i. $$

Denote $Y = -\ln X$; then $$ F_Y(y) = \mathbb{P}(-\ln X \le y) = 1 - \mathbb{P}(\ln X \le -y) = 1 - F_X(e^{-y}) = 1 - \int_0^{e^{-y}} \theta^{-1} x^{1/\theta - 1}\,dx = 1- (e^{-y})^{1/\theta} = 1 - e^{-y/\theta}, $$ i.e., $-\ln X \sim \mathcal{E}xp(1/\theta)$: exponential with rate $1/\theta$ and hence mean $\theta$.

Consequently $\mathbb{E}\left[\sum\ln x_i\right]=-n\theta$, so $$ I(\theta) = -\mathbb{E}[l''] = -\frac{n}{\theta^2} - \frac{2}{\theta^3}\,\mathbb{E}\left[\sum\ln x_i\right] = -\frac{n}{\theta^2} + \frac{2n}{\theta^2} = \frac{n}{\theta^2}, $$ and the asymptotic variance of $\hat\theta$ is $I^{-1}(\theta)=\theta^2/n$, matching the bound.
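A quick Monte Carlo sanity check of this computation (the true value $\theta = 2$ and the sample sizes below are hypothetical choices for illustration): since $-\ln X$ is exponential with mean $\theta$, we can simulate $\sum\ln x_i$ directly and verify that $-l''(\theta)$ averages out to $n/\theta^2$.

```python
import random

random.seed(0)
theta = 2.0    # hypothetical true parameter, chosen for this check
n = 1000       # sample size
reps = 2000    # Monte Carlo replications

# -ln X ~ Exp with mean theta, so sum(ln x_i) is minus a sum of exponentials.
# For this pdf, l''(theta) = n/theta**2 + (2/theta**3) * sum(ln x_i),
# and the Fisher information is I(theta) = -E[l''] = n/theta**2.
total = 0.0
for _ in range(reps):
    sum_log_x = -sum(random.expovariate(1.0 / theta) for _ in range(n))
    l2 = n / theta**2 + (2 / theta**3) * sum_log_x
    total += -l2
est_info = total / reps
print(est_info, n / theta**2)  # the two numbers should nearly agree
```

With these settings the simulated average lands within a fraction of a percent of $n/\theta^2 = 250$.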


I suppose that the pdf was meant to be $$f_{\theta}(x)=\frac{1}{\theta}x^{(1-\theta)/\theta}\mathbf1_{0<x<\color{red}1}\quad,\,\theta>0$$

Provided some regularity conditions are met, the asymptotic distribution of an MLE $\hat\theta$ of $\theta$, when a sample of size $n$ is drawn from a population with density $f_{\theta}$, is given by

$$\sqrt{n}(\hat\theta-\theta)\stackrel{a}\sim N\left(0,\frac{1}{I_{X_1}(\theta)}\right)$$

where $I_{X_1}(\theta)=E_{\theta}\left[\left(\frac{\partial}{\partial\theta}\ln f_{\theta}(X_1)\right)^2\right]$ is the Fisher information in a single observation.

Your pdf $f_{\theta}$ is a member of the one-parameter exponential family, so it certainly satisfies those regularity conditions.
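The asymptotic normality statement can be checked empirically. Setting $l'(\theta)=0$ gives the MLE in closed form, $\hat\theta=-\frac{1}{n}\sum\ln X_i$, a sample mean of $\mathcal{E}xp(1/\theta)$ variables, so a standardized $\sqrt{n}(\hat\theta-\theta)/\theta$ should fall in $[-1.96, 1.96]$ about 95% of the time. A minimal sketch (the values of `theta`, `n`, and `reps` are illustrative choices):

```python
import math
import random

random.seed(2)
theta = 2.0    # hypothetical true parameter
n = 300        # sample size per replication
reps = 4000    # number of simulated samples

# Solving l'(theta) = 0 gives the MLE: theta_hat = -(1/n) * sum(ln X_i).
# Since -ln X ~ Exp(mean theta), theta_hat is a mean of exponentials.
inside = 0
for _ in range(reps):
    theta_hat = sum(random.expovariate(1.0 / theta) for _ in range(n)) / n
    # standardize by sqrt(1 / I_{X_1}(theta)) = theta
    z = math.sqrt(n) * (theta_hat - theta) / theta
    if abs(z) <= 1.96:
        inside += 1
coverage = inside / reps
print(coverage)  # should be close to 0.95
```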

Now the Cramér–Rao bound for $\theta$ based on the sample $\mathbf X=(X_1,\ldots,X_n)$ is given by $$\text{Crlb}=\frac{1}{I_{\mathbf X}(\theta)}$$

where $I_{\mathbf X}(\theta)=E_{\theta}\left[\left(\frac{\partial}{\partial\theta}\ln f_{\theta}(\mathbf X)\right)^2\right]$ is the information in the entire sample $\mathbf X$.

Finally, recall that $I_{\mathbf X}(\theta)=nI_{X_1}(\theta)$, so $\text{Crlb}=\frac{1}{nI_{X_1}(\theta)}$, which is exactly the asymptotic variance of $\hat\theta$. For this pdf, $I_{X_1}(\theta)=1/\theta^2$, giving $\text{Crlb}=\theta^2/n$.
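To see the bound attained numerically, one can simulate the MLE $\hat\theta=-\frac{1}{n}\sum\ln X_i$ (obtained by solving $l'(\theta)=0$) many times and compare its empirical variance with $\theta^2/n$. A minimal sketch, assuming a hypothetical true value $\theta=2$:

```python
import random

random.seed(1)
theta = 2.0    # hypothetical true parameter
n = 200        # sample size
reps = 5000    # number of simulated samples

# MLE for this pdf: theta_hat = -(1/n) * sum(ln X_i), i.e. the sample
# mean of the Exp(mean theta) variables -ln X_i.
mles = [sum(random.expovariate(1.0 / theta) for _ in range(n)) / n
        for _ in range(reps)]

mean_hat = sum(mles) / reps
emp_var = sum((t - mean_hat) ** 2 for t in mles) / reps
crlb = theta**2 / n  # = 1 / (n * I_{X_1}(theta))
print(emp_var, crlb)  # empirical variance of the MLE vs. the bound
```

With these settings the empirical variance comes out within a few percent of $\theta^2/n = 0.02$, consistent with the MLE being asymptotically efficient.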