Find the form of the most powerful critical region of size $\alpha$.


A single observation $x$ is used to test the null hypothesis $\theta=\theta_0$ against the alternative hypothesis $\theta=\theta_1<\theta_0$ for a geometric distribution with parameter $\theta$. Use the Neyman-Pearson lemma to find the form of the most powerful critical region of size $\alpha$.

Initially, I started with $$L_0=\prod_{x=1}^n \theta_0(1-\theta_0)^{x-1} = \theta_0^n(1-\theta_0)^{-n+\sum_{x=1}^n x}$$ and $$L_1=\prod_{x=1}^n \theta_1 (1-\theta_1)^{x-1} = \theta_1^n(1-\theta_1)^{-n+\sum_{x=1}^nx}$$

Then, $\dfrac{L_0}{L_1}=\dfrac{(1-\theta_0)^{\sum_{x=1}^n x}}{(1-\theta_1)^{\sum_{x=1}^n x}}\leq K$ inside the critical region and $\geq K$ outside the critical region. However, this expression is very complex, so I wasn't sure how to simplify this any further.

I then thought, since it says a single observation of $x$, can I just let $x=1$? So then I end up with $L_0=\theta_0$ and $L_1=\theta_1$. However, then I get that the critical region is $\frac{\theta_0}{\theta_1}\leq k$, which I didn't think made sense because it doesn't depend on any variables.

BEST ANSWER

My first thought was that where you wrote $\displaystyle\prod_{x=1}^n \theta_0(1-\theta_0)^{x-1}$ and $\displaystyle \sum_{x=1}^n x,$ you need $\displaystyle \prod_{i=1}^n \theta_0(1-\theta_0)^{x_i-1}$ and $\displaystyle \sum_{i=1}^n x_i$ respectively, so multiplication and addition begin at $i=1,$ not at $x=1,$ and you have $x_i$ rather than $x.$


My second thought was that you wrote "a single observation", and that means $n=1,$ so your expressions are certainly more complicated than they need to be.


So you have $$ \frac{L_0}{L_1} = \frac{\theta_0(1-\theta_0)^{x-1}}{\theta_1(1-\theta_1)^{x-1}} = \left(\frac{1-\theta_0}{1-\theta_1}\right)^{x-1}. \tag 1 $$

Since $\theta_1 < \theta_0,$ the expression $(1)$ decreases as $x$ increases. Since small values of the ratio on the left in $(1)$ favor $H_1,$ that means large values of $x$ favor $H_1.$
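This monotonicity can be checked numerically. A minimal sketch, with illustrative values of $\theta_0$ and $\theta_1$ (not given in the problem):

```python
# Check that the likelihood ratio (1) decreases as x increases
# when theta1 < theta0. Values are illustrative, not from the problem.
theta0, theta1 = 0.5, 0.2  # theta1 < theta0

def likelihood_ratio(x):
    """L0/L1 for a single geometric observation x (support 1, 2, ...)."""
    return (theta0 * (1 - theta0) ** (x - 1)) / (theta1 * (1 - theta1) ** (x - 1))

ratios = [likelihood_ratio(x) for x in range(1, 6)]
# Each successive ratio is strictly smaller, so large x favors H1.
assert all(a > b for a, b in zip(ratios, ratios[1:]))
```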

To figure out how large is large enough, you need to solve $$ \Pr(X>c\mid H_0) \le \alpha \text{ for } c. $$
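Since the geometric tail has the closed form $\Pr(X>c\mid H_0)=(1-\theta_0)^c$ for support $\{1,2,\ldots\}$, the smallest such $c$ can be computed directly. A sketch with illustrative $\theta_0$ and $\alpha$ (not specified in the problem):

```python
import math

# Smallest integer c with P(X > c | H0) <= alpha, using the geometric
# tail P(X > c) = (1 - theta0)^c on support {1, 2, ...}.
# theta0 and alpha are illustrative, not given in the problem.
theta0, alpha = 0.5, 0.05

c = math.ceil(math.log(alpha) / math.log(1 - theta0))

# Verify: the tail at c is within alpha, while the tail at c - 1 exceeds it.
assert (1 - theta0) ** c <= alpha < (1 - theta0) ** (c - 1)
print(c)  # reject H0 when x > c
```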

ANSWER

The observation $x$ is a number in $1,2,\ldots$ (the support of the geometric distribution with pmf $\theta(1-\theta)^{x-1}$). The likelihood function is $$ \theta(1-\theta)^{x-1}$$ so the likelihood ratio is $$ \frac{\theta_0}{\theta_1}\left(\frac{1-\theta_0}{1-\theta_1}\right)^{x-1}$$ and depends on the data $x$. (And you're correct that it would be strange if it didn't.)