Step in proof $E_\theta T^*=E_\theta T$ of Rao-Blackwell


Say we have $T$, an estimator for $g(\theta)$, and $T^*=T^*(V)$, an estimator that depends only on the sufficient statistic $V$. My book claims the following: $$ E_\theta TT^*=\sum_{v}E(TT^*\mid V=v)P_\theta(V=v)=\sum_vT^*(v)E(T\mid V=v)P_\theta(V=v). $$ I don't see why this holds. How did they factorise the expectation, i.e. which rule did they apply? I know linearity of expectation, but I don't think that applies here. I also know that $E(XY)=E(X)E(Y)$ for $X,Y$ independent, but I don't understand what happened here.




Write $\mathcal X$ for the range of $X$ and $\mathcal Y$ for the range of $Y$. The definition of conditional expectation for discrete random variables is $$ E(X\mid Y=y) = \sum_{x\in\mathcal X} x\cdot P(X=x\mid Y=y) $$ and this implies $$ \sum_{y\in\mathcal Y} E(X\mid Y=y)P(Y=y) = \sum_{y\in\mathcal Y} \sum_{x\in\mathcal X} x\cdot P(X=x\mid Y=y)P(Y=y). $$ The right-hand side can be simplified since $$ P(X=x\mid Y=y)P(Y=y) = P(X=x,Y=y). $$ Therefore $$\begin{aligned} \sum_{y\in\mathcal Y} E(X\mid Y=y)P(Y=y) &= \sum_{y\in\mathcal Y} \sum_{x\in\mathcal X} x\cdot P(X=x,Y=y)\\ &= \sum_{x\in\mathcal X} x\cdot \sum_{y\in\mathcal Y} P(X=x,Y=y)\\ &= \sum_{x\in\mathcal X} x\cdot P(X=x) \\ &= E(X). \end{aligned}$$In the second step I used that $\sum_{y\in\mathcal Y} P(X=x,Y=y) = P(X=x, Y\in \mathcal{Y}) = P(X=x)$, because the events $\{Y=y\}$, $y\in\mathcal Y$, are disjoint and together exhaust all possibilities.
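This identity (the law of total expectation) can be sanity-checked numerically. Here is a minimal sketch with a made-up joint pmf for $(X,Y)$, comparing $E(X)$ computed directly against $\sum_y E(X\mid Y=y)P(Y=y)$:

```python
# Toy joint pmf P(X=x, Y=y); all probabilities are made up for illustration.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.25,
}
xs = {x for x, _ in joint}
ys = {y for _, y in joint}

# Direct computation: E(X) = sum_x x * P(X=x)
E_X = sum(x * p for (x, _), p in joint.items())

# Via conditioning: sum_y E(X | Y=y) * P(Y=y)
E_X_tower = 0.0
for y in ys:
    p_y = sum(joint[(x, y)] for x in xs)                     # P(Y=y)
    e_x_given_y = sum(x * joint[(x, y)] for x in xs) / p_y   # E(X | Y=y)
    E_X_tower += e_x_given_y * p_y

assert abs(E_X - E_X_tower) < 1e-12
```

The two computations agree because, as in the derivation above, $E(X\mid Y=y)P(Y=y)$ is just the inner sum of $x\cdot P(X=x,Y=y)$ over $x$, and summing over $y$ marginalises out $Y$.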

In your case, $X$ is $TT^*$ and $Y$ is $V$.

In the second step they use $$ E(TT^* \mid V=v) = E(T \cdot T^*(V) \mid V=v) = T^*(v)\, E(T\mid V=v), $$ since, given $V=v$, the factor $T^*(V)=T^*(v)$ is a known constant and can be pulled out of the conditional expectation ("pulling out what is known").
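That "pulling out what is known" step can also be verified numerically. The sketch below uses a hypothetical joint pmf for $(T,V)$ and an arbitrary function $T^*$, and checks $E(T\,T^*(V)\mid V=v) = T^*(v)\,E(T\mid V=v)$ for each value $v$:

```python
# Toy joint pmf P(T=t, V=v); all numbers are made up for illustration.
joint = {
    (1, 0): 0.2, (2, 0): 0.3,
    (1, 1): 0.1, (3, 1): 0.4,
}

def T_star(v):
    # Any function of the sufficient statistic V will do.
    return 2 * v + 1

vs = {v for _, v in joint}

results = []
for v in sorted(vs):
    p_v = sum(p for (_, vv), p in joint.items() if vv == v)  # P(V=v)
    # Left side: E(T * T*(V) | V=v), using the conditional pmf of T given V=v
    lhs = sum(t * T_star(v) * p / p_v
              for (t, vv), p in joint.items() if vv == v)
    # Right side: T*(v) * E(T | V=v)
    rhs = T_star(v) * sum(t * p / p_v
                          for (t, vv), p in joint.items() if vv == v)
    assert abs(lhs - rhs) < 1e-12
    results.append((v, lhs, rhs))
```

Both sides agree because, once we condition on $V=v$, $T^*(V)$ is no longer random: it is the fixed number $T^*(v)$ in every term of the sum defining the conditional expectation.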