Suppose that X1, . . . , Xn are independent identically distributed random variables with a B(m, θ) distribution where m is a known positive integer and θ is unknown.
I have shown that θ* = X1/m is unbiased for θ, and that T = X1 + · · · + Xn is sufficient for θ. I have then used the Rao–Blackwell theorem to find another unbiased estimator for θ, namely T/nm.
The question then states: A statistician cannot remember the exact statement of the Rao–Blackwell theorem and calculates E(T | X1) in an attempt to find an estimator of θ. Comment on the suitability or otherwise of this approach, giving your reasons.
I'm not sure how to tackle this last part!
I assume $T = \sum_{i=1}^{n}X_{i}$, i.e. the sum of the Binomial random variables (equivalently, the sum of $nm$ Bernoulli random variables).
The "statistician" swapped the arguments of the conditional expectation. Presumably he wants to start with an unbiased estimator and then improve it by conditioning on a sufficient statistic, as the Rao–Blackwell theorem prescribes, which means he needs to compute:
$T^{*} = \mathbb{E}\!\left(\frac{X_{1}}{m} \mid T\right) = \frac{1}{m}\,\mathbb{E}(X_{1} \mid T) = \frac{1}{m}\cdot\frac{T}{n} = \frac{T}{mn},$

where $\mathbb{E}(X_{1} \mid T) = T/n$ by symmetry: the $\mathbb{E}(X_{i} \mid T)$ are equal for all $i$ and sum to $T$.
Which is indeed what you got.
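If it helps to see this numerically, here is a minimal simulation sketch (assuming NumPy is available; the values of $n$, $m$, $\theta$ and $t$ below are arbitrary illustration choices) checking that the conditional mean of $X_1$ given $T = t$ is close to $t/n$, and that $T/nm$ is unbiased for $\theta$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, theta = 5, 4, 0.3          # arbitrary illustration values
reps = 200_000

# Each row is one i.i.d. sample X_1, ..., X_n from B(m, theta).
X = rng.binomial(m, theta, size=(reps, n))
T = X.sum(axis=1)

# Condition on a particular value of T and compare E(X_1 | T = t) with t/n.
t = 6
mask = T == t
print(X[mask, 0].mean(), t / n)      # both ≈ 1.2

# T/(nm) averages to theta, confirming unbiasedness.
print((T / (n * m)).mean(), theta)   # both ≈ 0.3
```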
On the other hand, the reversed formula gets us nowhere, mainly because $X_1$ is not a sufficient statistic (i.e. it doesn't contain all the information in the sample):
$\mathbb{E}(T \mid X_{1}) = \mathbb{E}\!\left(\sum_{i=1}^{n} X_{i} \mid X_{1}\right) = \sum_{i=1}^{n}\mathbb{E}(X_{i} \mid X_{1}) = \mathbb{E}(X_1 \mid X_1) + \sum_{i=2}^{n}\mathbb{E}(X_i \mid X_1) = X_1 + \sum_{i=2}^{n}\mathbb{E}(X_i),$

using the independence of the $X_i$ in the last step.
Since $\mathbb{E}(X_i) = m\theta$, this gives $\mathbb{E}(T \mid X_1) = X_1 + (n-1)m\theta$, which still involves the unknown $\theta$, so it is not a statistic at all and cannot be computed from the data.
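The same kind of simulation (again a sketch assuming NumPy, with arbitrary values of $n$, $m$, $\theta$ and $x_1$) shows that the conditional mean of $T$ given $X_1 = x_1$ matches $x_1 + (n-1)m\theta$, which we could never evaluate from data alone without knowing $\theta$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, theta = 5, 4, 0.3          # arbitrary illustration values
reps = 200_000

# Each row is one i.i.d. sample X_1, ..., X_n from B(m, theta).
X = rng.binomial(m, theta, size=(reps, n))
T = X.sum(axis=1)

# Conditioning on X_1 leaves the unknown theta in the answer.
x1 = 2
mask = X[:, 0] == x1
print(T[mask].mean(), x1 + (n - 1) * m * theta)  # both ≈ 6.8
```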
So the "statistician's" calculation is useless for estimation, because he conditioned on $X_1$, which is not a sufficient statistic.