On the joint asymptotic distribution of two maximum likelihood estimators.


Suppose we have an IID sample $Y_1, \dots, Y_n$ with pdf $f(y, \alpha)$ under hypothesis $H_f$ and $g(y, \beta)$ under hypothesis $H_g$ (the true distribution is $f(y, \alpha)$).

The log-likelihood for estimating $\beta$ is as usual $L(\beta):= \sum_{i=1}^n \log( g(Y_i, \beta))$, and the maximum likelihood estimator $\hat{\beta}$ is found by maximizing $L(\beta)$.

Assume for a moment that, for a given $\alpha$, $\hat{\beta}$ converges in probability to some value $\beta_{\alpha}$. From standard results we then have that $\sqrt{n}(\hat{\beta} - \beta_{\alpha})$ is asymptotically normal with mean $0$ and variance

$$\frac{E_{\alpha} \left[ \left( \frac{\partial \log g(Y, \ \beta_{\alpha})}{\partial \beta} \right)^2 \right]}{ \left( E_{\alpha} \left[ \frac{\partial^2 \log g(Y, \ \beta_{\alpha})}{\partial \beta^2} \right] \right)^2}$$
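As a numerical sanity check of this sandwich variance, here is a simulation with an illustrative model pair of my own choosing (not from Cox's paper): the truth is exponential and the fitted model is half-normal, so that $\hat{\beta}$ and $\beta_{\alpha}$ have closed forms and the formula above can be evaluated by hand.

```python
import numpy as np

# Illustrative check of the sandwich variance (my own choice of models):
# truth:  f(y, alpha) = alpha * exp(-alpha * y), with alpha = 1
# fitted: half-normal g(y, beta) = sqrt(2/pi)/beta * exp(-y^2 / (2 beta^2))
# The fitted MLE is beta_hat = sqrt(mean(Y^2)); the pseudo-true value solves
# beta_alpha^2 = E_alpha[Y^2] = 2/alpha^2, so beta_alpha = sqrt(2)/alpha.
rng = np.random.default_rng(0)
alpha, n, reps = 1.0, 1000, 10000

Y = rng.exponential(1 / alpha, size=(reps, n))
beta_hat = np.sqrt((Y ** 2).mean(axis=1))   # MLE under g, one per replication

beta_alpha = np.sqrt(2) / alpha
# Sandwich formula: E[(dlog g/dbeta)^2] / (E[d^2 log g/dbeta^2])^2.
# For this pair it evaluates to 2.5 / alpha^2 (the delta method applied to
# sqrt(mean(Y^2)) gives the same number, which is a useful cross-check).
asym_var = 2.5 / alpha ** 2

print(beta_hat.mean(), beta_alpha)      # should be close
print(n * beta_hat.var(), asym_var)     # should be close
```

The simulated mean of $\hat{\beta}$ matches $\beta_{\alpha}$ and $n \cdot \widehat{\operatorname{Var}}(\hat{\beta})$ matches the sandwich value, which at least confirms the formula for this example.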

The way to see this is to Taylor-expand $L'(\hat{\beta})$ around $\beta_{\alpha}$:

$$0 = L'(\hat{\beta}) = L'(\beta_{\alpha}) + L''(\beta_{\alpha})(\hat{\beta} - \beta_{\alpha}) + \frac{1}{2} L'''(\beta^*)(\hat{\beta} - \beta_{\alpha})^2$$

where $\beta^*$ lies between $\hat{\beta}$ and $\beta_{\alpha}$.

From this we have that $$ \sqrt{n}(\hat{\beta} - \beta_{\alpha}) = \frac{ -\frac{1}{\sqrt{n}} L'(\beta_{\alpha}) }{ \frac{1}{n} L''(\beta_{\alpha}) + \frac{1}{2n} L'''(\beta^*)(\hat{\beta} - \beta_{\alpha}) } $$

and we can conclude by applying the law of large numbers to the denominator (the remainder term vanishes since $\hat{\beta} \to \beta_{\alpha}$ in probability) and the central limit theorem to the numerator, together with Slutsky's theorem.

The central limit theorem can be applied because the summands have mean zero: since $\beta_{\alpha}$ maximizes $\beta \mapsto E_{\alpha}[\log g(Y, \beta)]$, the first-order condition gives

$$E_{\alpha} \left[ \frac{\partial \log g(Y, \ \beta_{\alpha})}{\partial \beta} \right] =0$$

Also, by differentiating this last equation with respect to $\alpha$ (keeping in mind that $\beta_{\alpha}$ depends on $\alpha$), one obtains

$$E_{\alpha} \left[ \frac{\partial \log f(Y, \ \alpha)}{\partial \alpha} \frac{\partial \log g(Y, \ \beta_{\alpha})}{\partial \beta} \right]= \frac{d\beta_{\alpha}}{d \alpha} E_{\alpha} \left[-\frac{\partial^2 \log g(Y, \ \beta_{\alpha})}{\partial \beta^2} \right]$$
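To spell out the differentiation step (interchanging differentiation and integration, which I take as a regularity assumption):

```latex
% The first-order condition, written as an integral:
%   \int f(y, \alpha)\, \frac{\partial \log g(y, \beta_\alpha)}{\partial \beta}\, dy = 0.
% Differentiating in \alpha, remembering that \beta_\alpha depends on \alpha:
\int \frac{\partial f(y,\alpha)}{\partial \alpha}\,
     \frac{\partial \log g(y, \beta_\alpha)}{\partial \beta}\, dy
\;+\; \frac{d\beta_\alpha}{d\alpha}
  \int f(y,\alpha)\,
     \frac{\partial^2 \log g(y, \beta_\alpha)}{\partial \beta^2}\, dy \;=\; 0.
% Writing \partial f/\partial\alpha = f \cdot \partial \log f/\partial\alpha turns the
% first integral into E_\alpha[(\partial \log f/\partial\alpha)(\partial \log g/\partial\beta)],
% and moving the second term to the right-hand side gives the displayed identity.
```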

This is all in the univariate case (I am considering a scalar $\hat{\beta}$). My question is: how can I see that the joint asymptotic distribution of $(\hat{\alpha}, \hat{\beta})$ is bivariate normal with covariance

$$\operatorname{Cov}(\hat{\alpha}, \hat{\beta}) \approx \frac{ \frac{d \beta_{\alpha}}{d \alpha}}{n E_{\alpha} \left[-\frac{\partial^2 \log f(Y, \ \alpha)}{\partial \alpha^2} \right] } \ ?$$
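The claimed covariance can at least be checked numerically. Here is a simulation with the same illustrative pair as before (my own choice, not from the paper): truth exponential, fitted half-normal, for which $\hat{\alpha} = 1/\bar{Y}$, $\beta_{\alpha} = \sqrt{2}/\alpha$, $d\beta_{\alpha}/d\alpha = -\sqrt{2}/\alpha^2$, and the Fisher information is $E_{\alpha}[-\partial^2 \log f/\partial \alpha^2] = 1/\alpha^2$.

```python
import numpy as np

# Numerical check of the claimed covariance for an illustrative pair
# (my own choice): truth f = Exponential(alpha), fitted g = half-normal(beta).
# Then alpha_hat = 1/mean(Y) and beta_hat = sqrt(mean(Y^2)),
# beta_alpha = sqrt(2)/alpha, so d beta_alpha / d alpha = -sqrt(2)/alpha^2,
# and E_alpha[-d^2 log f / d alpha^2] = 1/alpha^2 (Fisher information).
rng = np.random.default_rng(1)
alpha, n, reps = 1.0, 1000, 10000

Y = rng.exponential(1 / alpha, size=(reps, n))
alpha_hat = 1 / Y.mean(axis=1)
beta_hat = np.sqrt((Y ** 2).mean(axis=1))

claimed_cov = (-np.sqrt(2) / alpha ** 2) / (n * (1 / alpha ** 2))  # = -sqrt(2)/n
emp_cov = np.cov(alpha_hat, beta_hat)[0, 1]

print(n * emp_cov, n * claimed_cov)  # both close to -sqrt(2)
```

For this example $n \cdot \operatorname{Cov}(\hat{\alpha}, \hat{\beta}) \approx -\sqrt{2}$, and the empirical covariance agrees, so the formula seems right as stated.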

I think this is simple and that I am just out of practice with the multivariate central limit theorem; could someone explain it?
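For what it is worth, here is how far I get with the linearizations (all under the standard regularity conditions):

```latex
% Linearize both estimators, exactly as in the expansion above:
\sqrt{n}(\hat{\alpha} - \alpha)
  \approx \frac{\frac{1}{\sqrt{n}} \sum_i \partial_\alpha \log f(Y_i, \alpha)}
               {E_\alpha\!\left[-\partial_\alpha^2 \log f(Y, \alpha)\right]},
\qquad
\sqrt{n}(\hat{\beta} - \beta_\alpha)
  \approx \frac{\frac{1}{\sqrt{n}} \sum_i \partial_\beta \log g(Y_i, \beta_\alpha)}
               {E_\alpha\!\left[-\partial_\beta^2 \log g(Y, \beta_\alpha)\right]}.
% The multivariate CLT applied to the iid mean-zero pairs
%   (\partial_\alpha \log f(Y_i, \alpha),\; \partial_\beta \log g(Y_i, \beta_\alpha))
% should then give joint asymptotic normality, with limiting covariance
%   E_\alpha[\partial_\alpha \log f \cdot \partial_\beta \log g]
% divided by the two denominators; by the identity above, the factor
% E_\alpha[-\partial_\beta^2 \log g] cancels, leaving
%   (d\beta_\alpha/d\alpha) \,/\, E_\alpha[-\partial_\alpha^2 \log f].
```

But I am unsure how to justify the two linearizations holding jointly, which is the part I would like explained.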

For the curious: the paper where this result is stated is "Tests of Separate Families of Hypotheses" by D. R. Cox, Section 7, equation (29).