Rao-Blackwellize an estimator of uniform distribution


I am trying to use the Rao-Blackwell theorem to arrive at a better estimate for $\theta$ than $\bar{X}$, given that $X \sim U[\theta-0.5,\theta+0.5]$ and the sufficient statistic $T(X) = (X_{(1)}, X_{(n)})$. This is very similar to another question asked here, so I can see what the answer should be, but I am still confused. I would like to do the following, although I can see that this brings me back to where I started: \begin{align*} E[\bar{X} | T(X) ] &= E[X_i | X_{(1)} = x_l, X_{(n)} = x_u] \\ &= E[X_i | X_{i} = x_l] \times P(X_i = x_l) \\ &\text{ } + E[X_i | X_{i} = x_u] \times P(X_i = x_u) \\ &\text{ } + E[X_i | X_{i} \notin \{x_l, x_u \}] \times P(X_i \notin \{x_l, x_u \}) \\ &= x_l \frac{1}{n} + x_u \frac{1}{n} + \frac{\sum_{i=1}^n x_i - x_l - x_u}{n-2}\times \frac{n-2}{n} \\ &= \bar{X} \end{align*} I suppose my question can be reduced to: why does $E[X_i | X_{i} \notin \{x_l, x_u \}] = \frac{x_u - x_l}{2}$? (Assuming I did everything else right.)

Best answer

Well, the question you state in the end is not exactly the right one. It should be something like

Why does $\mathbb E[X_i \mid X_i \in (x_l,x_u)]=\frac{x_u+x_l}{2}$?

This is because we know that $X_i\notin \lbrace X_{(1)}, X_{(n)}\rbrace$ and $X_i\in [X_{(1)},X_{(n)}]$.

Now if you agree with that, observe that any uniform random variable $X\sim U(\mathcal A)$, where $\mathcal A$ is closed, has the property that for any sets $\mathcal B$ and $\mathcal C$, $\mathbb P (X\in\mathcal B | X\in\mathcal C)=\frac{\mathbb{P}(X\in\mathcal B, X\in\mathcal C)}{\mathbb{P}(X\in\mathcal C)}=\frac{\lvert \mathcal A\cap \mathcal B \cap \mathcal C\rvert}{\lvert \mathcal A \cap \mathcal C\rvert}$, which makes it a uniform distribution over the set $\mathcal A \cap \mathcal C$. In your case, $\mathcal A=[\theta - 0.5, \theta+0.5]\supseteq [x_l,x_u] = \mathcal C$, so the conditional distribution is uniform over $\mathcal A \cap \mathcal C=[x_l,x_u]$. At this point there is no conditioning left, and the expectation is \begin{align*} \int_{x_l}^{x_u} \frac{x}{x_u-x_l} dx &= \left[ \frac{x^2}{2(x_u-x_l)} \right]_{x_l}^{x_u}\\ &=\frac{x_u^2-x_l^2}{2(x_u-x_l)}\\ &=\frac{(x_u-x_l)(x_u+x_l)}{2(x_u-x_l)}\\ &=\frac{x_u+x_l}{2} \end{align*}
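If it helps to see this numerically, here is a minimal simulation sketch (parameter values are arbitrary choices for illustration): for each sample, the average of the $n-2$ interior points should match the midpoint $(x_l+x_u)/2$ on average, since each interior point is conditionally uniform on $[x_l, x_u]$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 3.0, 10, 200_000  # arbitrary illustrative values

# Draw many samples of size n from U[theta - 0.5, theta + 0.5].
X = rng.uniform(theta - 0.5, theta + 0.5, size=(trials, n))
x_l, x_u = X.min(axis=1), X.max(axis=1)

# Average of the n - 2 interior points in each sample.
interior_mean = (X.sum(axis=1) - x_l - x_u) / (n - 2)

# If interior points are conditionally uniform on [x_l, x_u],
# their conditional mean is the midpoint (x_l + x_u) / 2,
# so this difference should average out to roughly zero.
print(np.mean(interior_mean - (x_l + x_u) / 2))
```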


To go back to your Rao-Blackwellization, you get \begin{align*} \mathbb{E}[\bar X|X_{(1)},X_{(n)}] &= \frac{X_{(1)}}{n} + \frac{X_{(n)}}{n} + \frac{X_{(1)}+X_{(n)}}{2} \cdot \frac{n-2}{n}\\ &=\frac{X_{(1)}+X_{(n)}}{2} \end{align*}

This is, in the end, the UMVUE of $\theta$, because $(X_{(1)}, X_{(n)})$ is both sufficient and complete.