Find the asymptotic joint distribution of the MLE of $\alpha, \beta$ and $\sigma^2$


$X_1,\ldots,X_n$ are non-random constants. We observe random variables $Y_1,\ldots,Y_n$

that are independent but not identically distributed, with

$Y_i\sim N(\alpha+\beta X_i,\sigma^2)$, where $\alpha,\beta$ and $\sigma^2$ are unknown parameters.

Question: Find the asymptotic joint distribution of the MLE.

In the one-parameter case, the asymptotic distribution of the MLE should be:

$$\sqrt n (\hat{\theta}-\theta)\overset{d}{\to} N\!\left(0,\frac{1}{I(\theta)}\right)$$

and the MLE of $\alpha,\beta$ and $\sigma^2$ should be:

$$\hat{\alpha}=\bar{y}-\hat{\beta} \bar{x}$$

$$\hat{\beta}=\frac{\sum_{i=1}^n y_i(x_i-\bar{x})}{\sum_{i=1}^n(x_i-\bar{x})^2}$$

$$\widehat{\sigma^2}=\frac{\sum_{i=1}^n(y_i-\hat{\alpha}-\hat{\beta} x_i)^2}{n}$$
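As a sanity check, the closed-form MLEs above can be evaluated on simulated data; here the design points and the true parameter values $(\alpha,\beta,\sigma^2)=(1,2,0.5)$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: fixed (non-random) design points and true parameters.
n = 10_000
alpha, beta, sigma2 = 1.0, 2.0, 0.5
x = np.linspace(0.0, 1.0, n)  # the non-random constants X_1, ..., X_n
y = alpha + beta * x + rng.normal(scale=np.sqrt(sigma2), size=n)

# Closed-form MLEs from the formulas above.
x_bar, y_bar = x.mean(), y.mean()
beta_hat = np.sum(y * (x - x_bar)) / np.sum((x - x_bar) ** 2)
alpha_hat = y_bar - beta_hat * x_bar
sigma2_hat = np.mean((y - alpha_hat - beta_hat * x) ** 2)

print(alpha_hat, beta_hat, sigma2_hat)
```

With $n=10{,}000$ observations the three estimates should land close to the true values, consistent with consistency of the MLE.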

But how do I calculate the asymptotic joint distribution of the MLE?

There is a multivariate version of the convergence in distribution that you mentioned.

With parameter $\theta = (\alpha, \beta, \sigma^2)$ and MLE $\hat{\theta} = (\hat{\alpha}, \hat{\beta}, \widehat{\sigma^2})$, under regularity conditions (here, essentially that the empirical moments $\frac1n\sum_i x_i$ and $\frac1n\sum_i x_i^2$ converge) we have $$\sqrt{n}(\hat{\theta} - \theta) \overset{d}{\to} N(0, I(\theta)^{-1})$$ where $I(\theta)$ is the (average, per-observation) Fisher information matrix.
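For this particular model the information matrix can be written out explicitly. A sketch of the computation, starting from the per-observation log-likelihood and taking expected negative second derivatives (the notation $\overline{x^2}$ for the average of the $x_i^2$ is introduced here):

```latex
% Per-observation log-likelihood:
%   \ell_i(\theta) = -\tfrac12\log(2\pi\sigma^2)
%                    - \frac{(y_i-\alpha-\beta x_i)^2}{2\sigma^2}
% Averaging E[-\partial^2 \ell_i / \partial\theta\,\partial\theta^T] over i
% (the cross terms with \sigma^2 have mean zero) gives
I(\theta) =
\begin{pmatrix}
\dfrac{1}{\sigma^2} & \dfrac{\bar{x}}{\sigma^2} & 0 \\[1.5ex]
\dfrac{\bar{x}}{\sigma^2} & \dfrac{\overline{x^2}}{\sigma^2} & 0 \\[1.5ex]
0 & 0 & \dfrac{1}{2\sigma^4}
\end{pmatrix},
\qquad
\bar{x}=\frac1n\sum_{i=1}^n x_i,
\quad
\overline{x^2}=\frac1n\sum_{i=1}^n x_i^2 .
```

Inverting the block structure recovers the familiar marginals, e.g. the $(3,3)$ entry of $I(\theta)^{-1}$ is $2\sigma^4$, so $\sqrt{n}\,(\widehat{\sigma^2}-\sigma^2)\overset{d}{\to} N(0, 2\sigma^4)$, and the asymptotic variance of $\hat{\beta}$ is $\sigma^2/(\overline{x^2}-\bar{x}^2)$.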