Showing that a statistic is minimal sufficient but not complete (uniform distribution)


Let $X_1, \cdots, X_n$ be iid from a uniform distribution $U[\theta-\frac{1}{2}, \theta+\frac{1}{2}]$ with $\theta \in \mathbb{R}$ unknown. Show that the statistic $T(\mathbf{X}) = (X_{(1)}, X_{(n)})$ is minimal sufficient but not complete.

I am having trouble proving that it is not complete. My idea is as follows: if I can construct two functions of $T(\mathbf{X})$, say $f(X_{(1)}, X_{(n)})$ and $g(X_{(1)}, X_{(n)})$ with $f \neq g$, and show that both are unbiased estimators of $\theta$, then $E_\theta[f - g] = 0$ for all $\theta$ with $f - g$ not identically zero, so $T(\mathbf{X}) = (X_{(1)}, X_{(n)})$ cannot be complete. Is this the right approach? I am stuck because I am unsure of (1) how to find the expectations/distributions of the order statistics, and (2) how to construct $f$ and $g$.

Any help would be appreciated!
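Regarding point (1), a quick Monte Carlo sanity check (an illustrative sketch, not a proof; the parameter values and sample sizes are arbitrary choices) suggests what the order-statistic expectations look like: $E[X_{(1)}] = \theta - \frac{1}{2} + \frac{1}{n+1}$ and $E[X_{(n)}] = \theta + \frac{1}{2} - \frac{1}{n+1}$, so the midrange $(X_{(1)} + X_{(n)})/2$ is unbiased for $\theta$:

```python
import numpy as np

# Simulate the order statistics of n iid draws from U[theta-1/2, theta+1/2].
# Expected (by symmetry of the location family):
#   E[X_(1)] = theta - 1/2 + 1/(n+1),  E[X_(n)] = theta + 1/2 - 1/(n+1),
# so the midrange (X_(1) + X_(n)) / 2 is unbiased for theta.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 4, 300_000
x = rng.uniform(theta - 0.5, theta + 0.5, size=(reps, n))
x_min, x_max = x.min(axis=1), x.max(axis=1)

print(x_min.mean())                  # ~ theta - 1/2 + 1/(n+1) = 2.7
print(x_max.mean())                  # ~ theta + 1/2 - 1/(n+1) = 3.3
print(((x_min + x_max) / 2).mean())  # ~ theta = 3.0
```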



BEST ANSWER

The trick with these kinds of problems is to apply Basu's theorem to get a contradiction. Note that $X_{(n)} - X_{(1)}$ is ancillary for $\theta$: this is a location family, so the distribution of the range does not depend on $\theta$. If $(X_{(1)}, X_{(n)})$ were complete (it is already sufficient), then by Basu's theorem it would be independent of $X_{(n)} - X_{(1)}$. But this is a contradiction, since knowing $(X_{(1)}, X_{(n)})$ completely determines $X_{(n)} - X_{(1)}$.
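To see the ancillarity numerically, here is a small simulation (a sketch, not part of the proof; the choices of $n$, seed, and the two $\theta$ values are arbitrary) showing that the mean of the range $X_{(n)} - X_{(1)}$ is the same regardless of $\theta$:

```python
import numpy as np

# The range X_(n) - X_(1) is ancillary: its distribution (here, its mean)
# does not change as theta moves, because the family is a location family.
rng = np.random.default_rng(0)
n, reps = 5, 200_000

def mean_range(theta):
    # reps samples of size n from U[theta - 1/2, theta + 1/2]
    x = rng.uniform(theta - 0.5, theta + 0.5, size=(reps, n))
    return (x.max(axis=1) - x.min(axis=1)).mean()

print(mean_range(0.0), mean_range(10.0))  # both ~ (n-1)/(n+1) = 2/3
```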

10
On

HINT

For $g(X_{(1)},X_{(n)}) = X_{(n)} - X_{(1)},$ can you show that $$ E[g(X_{(1)},X_{(n)})] $$ does not depend on $\theta?$ Don't calculate it, just think about it.
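To close the loop once you have convinced yourself of the hint (a sketch; the constant below is the standard mean of the range of $n$ iid uniforms on an interval of length $1$):

$$ E_\theta\left[X_{(n)} - X_{(1)}\right] = \frac{n-1}{n+1} \quad \text{for all } \theta, $$

so setting $h(T) = X_{(n)} - X_{(1)} - \frac{n-1}{n+1}$ gives $E_\theta[h(T)] = 0$ for all $\theta$ while $P_\theta(h(T) = 0) \neq 1$. Hence $T = (X_{(1)}, X_{(n)})$ is not complete.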