minimal sufficient statistic


Let X ~ Ber(n1; p), Y ~ Ber(n2; p^2), where X and Y are independent. Find a minimal sufficient statistic T and, using a nontrivial function, show that it is not complete.

I get confused by having two different distributions! Is it valid to simply multiply the two distributions and use the factorization theorem?


There is 1 best solution below

BEST ANSWER

First write the likelihood function: $$f(x,y\mid p) = \big[p^x(1-p)^{1-x}\big]\big[p^{2y}(1-p^2)^{1-y}\big].$$ Now apply the Lehmann–Scheffé characterization of minimal sufficiency: we need a statistic $T$ such that $$T(x_1, y_1) = T(x_2, y_2) \Leftrightarrow \frac{f(x_1,y_1\mid p)}{f(x_2,y_2\mid p)} = c(x_1, y_1, x_2, y_2),$$ i.e. the likelihood ratio is constant in $p$. To this end, compute the ratio: \begin{align} \frac{f(x_1,y_1\mid p)}{f(x_2,y_2\mid p)} &= \frac{p^{x_1}(1-p)^{1-x_1}p^{2y_1}(1-p^2)^{1-y_1}}{p^{x_2}(1-p)^{1-x_2}p^{2y_2}(1-p^2)^{1-y_2}} \\ &= p^{x_1 - x_2} (1-p)^{x_2 - x_1}\,p^{2(y_1 - y_2)} (1-p^2)^{y_2-y_1} \end{align} Writing $1-p^2 = (1-p)(1+p)$, the ratio equals $p^{(x_1-x_2)+2(y_1-y_2)}(1-p)^{(x_2-x_1)+(y_2-y_1)}(1+p)^{y_2-y_1}$, which is free of $p$ if and only if $y_1 = y_2$ (forced by the $(1+p)$ factor) and then $x_1 = x_2$. By Lehmann–Scheffé, this implies that $T(X, Y) = (X, Y)$ is a minimal sufficient statistic.
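The question also asks to show $T$ is not complete, which the answer above does not address. If we read $\mathrm{Ber}(n_1;p)$ as the binomial $\mathrm{Bin}(n_1,p)$ (an assumption on my part; with a single Bernoulli trial each, $(X,Y)$ appears to actually be complete, so the binomial reading seems intended), a standard nontrivial function is $$g(X,Y) = \frac{X(X-1)}{n_1(n_1-1)} - \frac{Y}{n_2},$$ since $E[X(X-1)] = n_1(n_1-1)p^2$ and $E[Y] = n_2 p^2$ give $E_p[g(X,Y)] = 0$ for every $p$, while $g$ is not almost surely zero (e.g. $g(2,0) = 2/(n_1(n_1-1)) \neq 0$ when $n_1 \geq 2$). A quick numerical check of this zero-expectation identity (the function $g$ is my construction, not part of the original answer):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def expected_g(n1, n2, p):
    """E[g(X, Y)] with g(x, y) = x(x-1)/(n1(n1-1)) - y/n2,
    X ~ Bin(n1, p) and Y ~ Bin(n2, p^2) independent."""
    e_xx = sum(x * (x - 1) * binom_pmf(x, n1, p) for x in range(n1 + 1))
    e_y = sum(y * binom_pmf(y, n2, p * p) for y in range(n2 + 1))
    return e_xx / (n1 * (n1 - 1)) - e_y / n2

for p in (0.1, 0.3, 0.5, 0.9):
    # each value is 0 up to floating-point error, for every p
    print(expected_g(3, 2, p))
```

Since the expectation vanishes identically in $p$ but $g$ takes nonzero values with positive probability, $(X,Y)$ is not complete under this reading.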