I'm looking at review questions and having trouble with this one! Let $X_1,\ldots,X_n$ be i.i.d. geometric random variables with pmf $P(X=x)=(1-p)^{x-1}p$ for $x=1,2,\ldots$ and $0<p<1$.
I need to use Rao-Blackwellization to find an unbiased estimator for $p(1-p)$. The examples of Rao-Blackwellization I have all involve Bernoulli variables, and I'm running into difficulty when I try to generalize this. Here is what I have come up with so far:
An unbiased statistic $w$ for $p-p^2$ can be written as $\frac{1}{X_1}-\frac{1}{X_1 \cdot X_2}$, since $E(X)=\frac{1}{p}$ in a geometric distribution. Further, a quick look at the joint distribution shows $\sum_i X_i$ to be a minimal sufficient statistic.
I run into trouble when I try to set up and compute $E(w\mid t)$. With Bernoulli variables, this seemed pretty straightforward: the statistic takes only the values $0$ and $1$, so the conditional expectation is just a probability. In this case, however, I have no idea how to proceed. I apologize if my question is lacking detail here; the truth is I'm completely stuck. Any hints or advice on how to proceed would be greatly appreciated!
Well, I only just learned about Rao-Blackwellization.
So first, $E[X]=1/p$, but $$E[1/X]=-\frac{p\log p}{1-p},$$ so your $w$ has the wrong expected value.
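To see this concretely, here is a quick numerical check of that expectation (a short Python sketch; the truncation at $x = 10000$ is an arbitrary cutoff, the tail is negligible):

```python
from math import log

p = 0.3
# E[1/X] for X ~ Geometric(p) on {1, 2, ...}: sum of (1/x) * (1-p)^(x-1) * p
numeric = sum((1 / x) * (1 - p) ** (x - 1) * p for x in range(1, 10_000))
closed_form = -p * log(p) / (1 - p)
print(numeric, closed_form)  # both ≈ 0.516, while 1/p ≈ 3.33
```

So $E[1/X]$ is nowhere near $1/E[X]=p$, which is why $\frac{1}{X_1}-\frac{1}{X_1 X_2}$ is not unbiased for $p-p^2$.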
Notice that $P(X=2)=(1-p)p$, so choose, for example, $w=I[X_1=2]$. (This is much like the example on Wikipedia.) Then $$ E[w\mid\sum_i X_i=t] = P(X_1=2\mid\sum_i X_i=t) = \frac{P(X_1=2,\sum_i X_i=t)}{P(\sum_i X_i=t)}. $$
So this is $$ \frac{P(X_1=2)P(\sum_{2\leq i\leq n}X_i = t-2)}{P(\sum_{1\leq i\leq n}X_i = t)}, $$ where a sum of $k$ i.i.d. geometric random variables follows a negative binomial distribution: $P(\sum_{1\leq i\leq k} X_i = t) = \binom{t-1}{k-1}p^k(1-p)^{t-k}$ for $t = k, k+1, \ldots$
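Using the negative binomial pmf $\binom{t-1}{k-1}p^k(1-p)^{t-k}$ for a sum of $k$ of these geometrics, every power of $p$ and $1-p$ cancels in the ratio, leaving the estimator $\binom{t-3}{n-2}\big/\binom{t-1}{n-1}$. A quick numerical sanity check in Python (the truncation at $t=500$ is an arbitrary cutoff; the tail is negligible):

```python
from math import comb

def negbin_pmf(t, k, p):
    # P(sum of k i.i.d. Geometric(p) variables on {1, 2, ...} equals t)
    return comb(t - 1, k - 1) * p**k * (1 - p) ** (t - k)

def rb_estimator(t, n):
    # E[I(X_1 = 2) | sum = t] = C(t-3, n-2) / C(t-1, n-1)
    if t - 3 < n - 2:  # impossible to have X_1 = 2 with the other n-1 summing to t-2
        return 0.0
    return comb(t - 3, n - 2) / comb(t - 1, n - 1)

n, p = 5, 0.3
# E over the distribution of T = sum of n geometrics (truncated sum over t)
expect = sum(rb_estimator(t, n) * negbin_pmf(t, n, p) for t in range(n, 500))
print(expect, p * (1 - p))  # both ≈ 0.21
```

Note that the estimator depends on the data only through $t$ and $n$, as it must: $p$ has cancelled entirely, which is exactly what conditioning on a sufficient statistic guarantees.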