Likelihood Ratio Test distribution


I am reading the third edition of Testing Statistical Hypotheses by Lehmann and Romano, specifically the last part of Chapter 12, where a theorem gives the asymptotic distribution of the likelihood ratio statistic under three different types of hypothesis. The book proves it only when the hypothesis is simple; the composite case is left to the reader, and this is what I am trying to do.

It is assumed that $X_1,\ldots,X_n$ are i.i.d. according to a q.m.d. family $\{P_\theta,\theta\in\Omega\}$, where $\Omega$ is an open subset of $\mathbb{R}^k$ and $I(\theta)$ is positive definite. The null hypothesis is $\theta_0\in\Omega_0$, where

$$\Omega_0=\left\{\theta=(\theta_1,\ldots,\theta_k): A(\theta-a)=0 \right\},$$

where $A$ is a $p\times k$ matrix of rank $p$ and $a$ is a fixed $k\times 1$ vector. In this theorem the likelihood ratio is defined as $R_n=L_n(\hat{\theta}_n)/L_n(\hat{\theta}_{n,0})$, where $\hat{\theta}_n$ is an efficient estimator of $\theta_0$ and $\hat{\theta}_{n,0}$ is an efficient estimator of $\theta_0$ under the constraint $\theta_0\in\Omega_0$ (they need not be the MLEs in $\Omega$ and $\Omega_0$). Then, for any $\theta_0\in\Omega_0$, $$2\log(R_n) \overset{d}{\longrightarrow} \chi_p^2,$$ where $\chi_p^2$ denotes a chi-squared distribution with $p$ degrees of freedom.
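As a sanity check on the statement (not a proof), one can verify the limit by Monte Carlo in a toy Gaussian model. Below is a minimal sketch, assuming $k=2$, $X_i \sim N(\theta, I_2)$, $A = (1,0)$, $a = 0$, so that $\Omega_0 = \{\theta : \theta_1 = 0\}$ and $p = 1$. In this model the MLEs (which are efficient, so they qualify as the estimators above) are explicit, and $2\log R_n = n\,\bar{X}_1^2$, which is exactly $\chi^2_1$ under the null; the simulation parameters ($n$, number of replications) are arbitrary choices.

```python
import numpy as np

# Toy model: X_i ~ N(theta, I_2), null constraint theta_1 = 0 (so p = 1).
# Unrestricted MLE: sample mean; restricted MLE: (0, second coordinate).
# For this Gaussian model, 2*log(R_n) = n * (xbar_1)^2.

rng = np.random.default_rng(0)
n, reps = 200, 20000
theta0 = np.array([0.0, 1.5])   # true parameter, chosen inside Omega_0

x = rng.normal(theta0, 1.0, size=(reps, n, 2))
xbar = x.mean(axis=1)           # unrestricted MLE in each replication
stat = n * xbar[:, 0] ** 2      # 2*log(R_n) for each replication

# A chi^2_1 variable has mean 1 and variance 2; the empirical moments
# should be close to these values.
print(stat.mean(), stat.var())
```

If $\log R_n$ really went to zero in probability, the empirical mean of `stat` would collapse toward $0$; instead it stays near $1$, consistent with the $\chi^2_p$ limit.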

I have been trying to prove this but have not succeeded. Even worse, I have "proven" that $\log(R_n)$ goes to zero in probability, and I cannot see where my mistake is.

Could someone give a hint? Thank you for your help.