Approximation of region of rejection


Let $X_1,\ldots,X_n$ be a random sample of size $n$ from the beta distribution $B(\theta, 1)$, whose pdf is given by $f(x;\theta)=\theta x^{\theta-1}$ for $0<x<1$, and suppose the hypotheses are:

$H_0: \theta=1$ vs. $H_1: \theta \neq 1$

Using the likelihood ratio test with significance level $\alpha$ ($0<\alpha<1$), I have already found that $-2\sum_{i=1}^n \log X_i \sim \chi^2 (2n)$ under $H_0$, and that the rejection region is given by:

$$-2\sum_{i=1}^n \log X_i \le 2nc_1 \text{ or } -2\sum_{i=1}^n \log X_i \ge 2nc_2$$

where the constants $c_1, c_2$ satisfy $\int_{2nc_1}^{2nc_2}f_{2n}(x) \, dx = 1-\alpha$ and $c_1- \log{c_1} =c_2-\log{c_2} $, $f_{2n}$ is the pdf of $\chi^2(2n)$ distribution.

Here is the problem:

Show that the constants $c_1,c_2$ can be approximated as $c_1 \approx \dfrac{ \chi^2_{1-\alpha/2}(2n)}{2n},c_2 \approx \dfrac{ \chi^2_{\alpha/2}(2n)}{2n}$ when $n$ is sufficiently large.

I attempted to prove that the two integrals $\int_0^{2nc_1}f_{2n}(x) \, dx$ and $\int_{2nc_2}^\infty f_{2n}(x) \, dx$ both converge to $\alpha/2$ when $n \to \infty$. Since the sum of the two integrals is $\alpha$, it suffices to show that the first integral goes to $\alpha/2$.

Here is where I am stuck. Does anyone have ideas? Any hints or advice will help a lot! Thanks.


There are 2 best solutions below


If the null hypothesis is true, then $X_1,\ldots,X_n\sim\mathrm{i.i.d.}\operatorname{Uniform}(0,1),$ so $$ \text{for } y\ge0, \quad\Pr(-2\log X_1 \ge y) = \Pr(X_1\le e^{-y/2}) = e^{-y/2}. $$ Therefore, since the exponential distribution with expected value $2$ is the chi-square distribution with $2$ degrees of freedom, we have $-2\log X_1\sim\chi^2_2,$ and so $-2\log X_1 - \cdots -2\log X_n \sim\chi^2_{2n}.$
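This distributional claim is easy to sanity-check numerically. Below is a quick Monte Carlo sketch of my own (standard library only, not part of the answer): under $H_0$ each $X_i$ is Uniform$(0,1)$, so each $-2\log X_i$ should be Exponential with mean $2$, and the sum of $n$ of them should behave like $\chi^2_{2n}$, with mean $2n$ and variance $4n$.

```python
import math
import random
import statistics

# Illustrative check (my own sketch): simulate the LRT statistic under H0
# and compare its sample mean and variance with the chi-square(2n) values
# mean = 2n and variance = 4n.
rng = random.Random(0)
n, reps = 50, 20_000
sums = [sum(-2 * math.log(rng.random()) for _ in range(n)) for _ in range(reps)]

m = statistics.mean(sums)       # should be close to 2n = 100
v = statistics.variance(sums)   # should be close to 4n = 200
print(m, v)
```

With `n = 50` the sample mean lands near $100$ and the sample variance near $200$, as the chi-square claim predicts.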

You reject the null hypothesis if this sum is too big or too small, and you want to do that with probability $\alpha;$ hence probability $\alpha/2$ that it's too big and $\alpha/2$ that it's too small. Thus $2nc_1$ and $2nc_2$ must be so chosen that $\Pr\left( -\sum_{i=1}^n \log X_i <2nc_1 \right) = \alpha/2,$ i.e. $\Pr(\chi^2_{2n} < 2nc_1) = \alpha/2,$ and similarly for $c_2.$

So there's no need to consider a limit as $n\to\infty.$
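To make that concrete, here is a sketch of how the exact cutoffs could be computed at any fixed $n$ (the function name, `reps`, and the chosen $n$, $\alpha$ are mine, not from the answer). The Python standard library has no chi-square inverse CDF, so the $\alpha/2$ and $1-\alpha/2$ quantiles of $\chi^2_{2n}$ are estimated by Monte Carlo, using the fact that a $\chi^2_{2m}$ draw is a sum of $m$ independent Exponential(mean $2$) draws.

```python
import math
import random

# Estimate the alpha/2 and (1 - alpha/2) quantiles of chi-square(df)
# by simulation; df is assumed even here (df = 2n in this problem).
def chi2_quantiles_mc(df, alpha, reps=100_000, seed=1):
    rng = random.Random(seed)
    draws = sorted(
        sum(-2 * math.log(rng.random()) for _ in range(df // 2))
        for _ in range(reps)
    )
    return draws[int(reps * alpha / 2)], draws[int(reps * (1 - alpha / 2))]

n, alpha = 30, 0.05
lo, hi = chi2_quantiles_mc(2 * n, alpha)   # estimates of 2n*c1 and 2n*c2
c1, c2 = lo / (2 * n), hi / (2 * n)
print(c1, c2)  # c1 < 1 < c2, the two-sided cutoffs at this exact finite n
```

Note that this gives the cutoffs exactly (up to simulation error) at $n=30$; no large-$n$ limit is involved.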

In other words, you've done about $99\%$ of the problem already.


I finally found the solution.

The two key points are:

(1) Normal approximation of the chi-square distribution

(2) Chi-square approximation of the likelihood-ratio test (LRT) statistic under the null hypothesis

First, by point (2), the Wald approximation of the LRT statistic is $n(\hat{\theta}-1)^2 = n \left(\dfrac{-\overline{\log{X}}-1}{\overline{\log{X}}} \right)^2 \simeq n(-\overline{\log{X}}-1)^2$, since $-\overline{\log{X}} \to_{p} 1$ as $n\to \infty$ under the null hypothesis. (Here the Fisher information at $\theta=1$ is $1$; compute it as an exercise.)
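Filling in that exercise: from the pdf $f(x;\theta)=\theta x^{\theta-1}$,

$$\log f(x;\theta) = \log\theta + (\theta-1)\log x, \qquad \frac{\partial^2}{\partial\theta^2}\log f(x;\theta) = -\frac{1}{\theta^2},$$

so the Fisher information is

$$I(\theta) = -\mathrm{E}\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right] = \frac{1}{\theta^2}, \qquad I(1)=1.$$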

Therefore, the approximate rejection region becomes $n(-\overline{\log{X}}-1)^2 \ge \chi_\alpha^2(1)=z_{\alpha/2}^2$

$\therefore -\overline{\log X} \ge 1+z_{\alpha/2}/\sqrt{n}$ or $-\overline{\log X} \le 1-z_{\alpha/2}/\sqrt{n}$, so

$-2n\overline{\log X} \ge 2n(1+z_{\alpha/2}/\sqrt{n})\simeq 2nc_2$ or $-2n\overline{\log X} \le 2n(1-z_{\alpha/2}/\sqrt{n}) \simeq 2nc_1$

The above inequalities become $\dfrac{-2n\overline{\log X}-2n}{\sqrt{4n}} \ge z_{\alpha/2}$ or $\dfrac{-2n\overline{\log X}-2n}{\sqrt{4n}} \le -z_{\alpha/2}$

Hence, by point (1), the probabilities $\mathrm{P}\left(\dfrac{-2n\overline{\log X}-2n}{\sqrt{4n}} \ge z_{\alpha/2}\right)$ and $\mathrm{P}\left(\dfrac{-2n\overline{\log X}-2n}{\sqrt{4n}} \le -z_{\alpha/2}\right)$ both converge to $\alpha/2$. This proves that $2nc_2 \simeq 2n(1+z_{\alpha/2}/\sqrt{n}) \simeq \chi_{\alpha/2}^2(2n)$ and $2nc_1 \simeq 2n(1-z_{\alpha/2}/\sqrt{n}) \simeq \chi_{1-\alpha/2}^2(2n)$, matching the roles of $c_1$ and $c_2$ in the problem statement ($c_1$ the lower cutoff, $c_2$ the upper).
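This final approximation can also be checked numerically. The sketch below is my own (standard library only; the chosen $n$, $\alpha$, and `reps` are illustrative): it compares Monte Carlo estimates of $\chi^2_{1-\alpha/2}(2n)$ and $\chi^2_{\alpha/2}(2n)$ with the normal-approximation values $2n(1 - z_{\alpha/2}/\sqrt{n})$ and $2n(1 + z_{\alpha/2}/\sqrt{n})$.

```python
import math
import random
from statistics import NormalDist

n, alpha = 100, 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)   # z_{alpha/2}, about 1.96

# Simulate chi-square(2n) draws as sums of n Exponential(mean 2) terms,
# then read off the two tail quantiles empirically.
rng = random.Random(2)
reps = 40_000
draws = sorted(
    sum(-2 * math.log(rng.random()) for _ in range(n))
    for _ in range(reps)
)
lower_q = draws[int(reps * alpha / 2)]         # estimates chi2_{1-alpha/2}(2n)
upper_q = draws[int(reps * (1 - alpha / 2))]   # estimates chi2_{alpha/2}(2n)

r_lo = lower_q / (2 * n * (1 - z / math.sqrt(n)))
r_hi = upper_q / (2 * n * (1 + z / math.sqrt(n)))
print(r_lo, r_hi)  # both ratios should be close to 1 for large n
```

At $n=100$ both ratios already sit within a few percent of $1$, consistent with the claimed large-$n$ approximation.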