Finding 4 Different Sufficient Statistics


If $X_1, ..., X_n$ is a random sample taken from a geometric population of the form $$ P(X = x;p) = p(1-p)^x, $$ for $x = 0, 1, 2, ...,$ and $0 < p < 1$, find four different sufficient statistics for $p$.

Attempt:

I have found the joint pmf of the random sample to be

$$ P(\mathbf{X} = \mathbf{x}; p) = p^n(1-p)^{\sum x_i}, ~~ x_i = 0, 1, 2, ..., ~~\text{and}~~ 0 < p < 1. $$

The only sufficient statistic I can think of is $T(\mathbf{X}) = \sum X_i$, the sample total.
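This choice can be confirmed by the Fisher–Neyman factorization theorem: the joint pmf factors as

$$ P(\mathbf{X} = \mathbf{x}; p) = \underbrace{p^n(1-p)^{T(\mathbf{x})}}_{g(T(\mathbf{x}),\, p)} \cdot \underbrace{1}_{h(\mathbf{x})}, \qquad T(\mathbf{x}) = \sum x_i, $$

that is, it depends on the data only through $\sum x_i$, so $T(\mathbf{X}) = \sum X_i$ is sufficient for $p$.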

Writing the joint pmf in exponential form, I have found

$$ P(\mathbf{X} = \mathbf{x}; p) = \exp\Bigl[n\ln(p) + \Bigl(\sum x_i\Bigr)\ln(1-p)\Bigr]. $$

I don't see how it helps, though.


Hints:

The entire sample itself is a sufficient statistic with $T(\mathbf{X}) = \mathbf{X}$ (why?).

For another sufficient statistic, recall that every minimal sufficient statistic is sufficient, and use the following theorem (Theorem 6.2.13 in Casella and Berger):

Let $f(\mathbf{x} \mid \theta)$ be the pmf or pdf of a sample $\mathbf{X}$. Suppose there exists a function $T(\mathbf{x})$ such that, for every two sample points $\mathbf{x}$ and $\mathbf{y}$, the ratio $f(\mathbf{x} \mid \theta)/f(\mathbf{y} \mid \theta)$ is constant as a function of $\theta$ if and only if $T(\mathbf{x}) = T(\mathbf{y})$. Then $T(\mathbf{X})$ is a minimal sufficient statistic for $\theta$.

If you apply this method, you will find that $T(\mathbf{X}) = \sum X_i$ makes the ratio $f(\mathbf{x} \mid \theta)/f(\mathbf{y} \mid \theta)$ constant as a function of $\theta$ if and only if $T(\mathbf{x}) = T(\mathbf{y})$, so $T(\mathbf{X}) = \sum X_i$ is minimal sufficient, and hence sufficient.
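Concretely, for this geometric sample the ratio is

$$ \frac{f(\mathbf{x} \mid p)}{f(\mathbf{y} \mid p)} = \frac{p^n(1-p)^{\sum x_i}}{p^n(1-p)^{\sum y_i}} = (1-p)^{\sum x_i - \sum y_i}, $$

which is constant as a function of $p$ if and only if $\sum x_i = \sum y_i$.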

Let $g: A \to B$ with $A, B \subset \mathbb{R}$ be a one-to-one function. It follows that $T(\mathbf{x}) = T(\mathbf{y})$ if and only if $g(T(\mathbf{x})) = g(T(\mathbf{y}))$. Thus $f(\mathbf{x} \mid \theta)/f(\mathbf{y} \mid \theta)$ is constant as a function of $\theta$ if and only if $g(T(\mathbf{x})) = g(T(\mathbf{y}))$, and so $g(T(\mathbf{X}))$ is minimal sufficient, and thus sufficient, for $\theta$.

So, if you have a minimal sufficient statistic for $\theta$, any one-to-one function of it is also minimal sufficient, and thus sufficient. For example, $\bar{X} = \frac{1}{n}\sum X_i$, $2\sum X_i$, and $\bigl(\sum X_i\bigr)^3$ are all one-to-one functions of $\sum X_i$ on its range; together with $\sum X_i$ and $\mathbf{X}$ itself, that gives you more than the four sufficient statistics asked for.
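As a numerical sanity check (my own illustration, not part of the hints above; the simulation setup, function names, and the choice $n = 2$, $\sum x_i = 2$ are all mine), sufficiency of $\sum X_i$ means the conditional distribution of the sample given the total does not depend on $p$. A minimal sketch:

```python
import random
from collections import Counter

def geom(rng, p):
    # Number of failures before the first success: P(X = x) = p * (1-p)**x
    x = 0
    while rng.random() > p:
        x += 1
    return x

def conditional_freqs(p, n=2, total=2, trials=200_000, seed=1):
    """Empirical distribution of the sample vector given sum(X) == total.

    If sum(X) is sufficient for p, this distribution should be (nearly)
    the same for every p."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        sample = tuple(geom(rng, p) for _ in range(n))
        if sum(sample) == total:
            counts[sample] += 1
    kept = sum(counts.values())
    return {s: c / kept for s, c in counts.items()}

# With n = 2 and total = 2, the possible samples are (0,2), (1,1), (2,0);
# conditionally they should each have probability 1/3 for every p.
for p in (0.3, 0.6):
    print(p, conditional_freqs(p))
```

For both values of $p$ the three empirical frequencies come out close to $1/3$, matching the fact that the conditional law given the total is uniform over the compositions of $2$ and free of $p$.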