Expectation of a sufficient statistic, conditional on another sufficient statistic


A random variable $X$ is distributed with pdf $f(x,θ)$. $T(X)$ is a sufficient statistic for $θ,$ and $S(X)$ is a minimal sufficient statistic for $θ.$ The following is stated in my notes without explanation: $$E_θ(T\mid S) = g(S)$$ for some function $g$ that does not depend on $θ.$ The subscript in $E_θ$ indicates that the expectation is taken under the distribution indexed by $θ,$ so a priori it could depend on $θ.$

Intuitively this makes sense, as we should be able to express any function of $X$ (including $T(X)$) as just a function of $S,$ since $S$ is sufficient. However, how can I show this more formally, using the definition of sufficiency? For example, how could I show it by writing the expectation as an integral (or sum) over $X$?

Best answer:

It is not true that every function of the data can be expressed as a function of the minimal sufficient statistic. For example, if $X_1,\ldots,X_n \sim \text{i.i.d.} \operatorname N(\mu,1)$ and $\mu\in\mathbb R$ indexes this family of distributions, then $X_1+\cdots + X_n$ is sufficient for this family of distributions, in the sense that the conditional distribution of $(X_1,\ldots,X_n)$ given $X_1+\cdots+X_n$ does not depend on $\mu,$ but clearly one cannot express $X_1$ as a function of this sufficient statistic.
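This example can be checked numerically. The following sketch (my own illustration, not part of the answer) uses the fact that, by symmetry of $X_1,\ldots,X_n$ given $S = X_1+\cdots+X_n,$ we have $E(X_1 \mid S) = S/n$ for every $\mu.$ Since $(X_1, S)$ is jointly Gaussian, the least-squares line of $X_1$ on $S$ recovers this conditional mean: slope $\approx 1/n$ and intercept $\approx 0,$ no matter which $\mu$ generated the data.

```python
import numpy as np

# Sketch (hypothetical illustration): for X_1, ..., X_n i.i.d. N(mu, 1)
# with S = X_1 + ... + X_n, symmetry gives E(X_1 | S) = S/n regardless
# of mu.  Because (X_1, S) is jointly Gaussian, the fitted regression
# line of X_1 on S estimates exactly this conditional mean, so its
# slope should be near 1/n and its intercept near 0 for every mu.

rng = np.random.default_rng(0)
n, reps = 4, 200_000

def conditional_mean_line(mu):
    """Simulate data under mu and fit X_1 ~ slope * S + intercept."""
    x = rng.normal(mu, 1.0, size=(reps, n))
    s = x.sum(axis=1)
    slope, intercept = np.polyfit(s, x[:, 0], 1)
    return slope, intercept

for mu in (0.0, 5.0):
    slope, intercept = conditional_mean_line(mu)
    print(f"mu = {mu}: slope = {slope:.3f} (1/n = {1/n}), "
          f"intercept = {intercept:.3f}")
```

The fitted line is the same for both values of $\mu,$ which is the point: $E(X_1\mid S)$ is a fixed function of $S$ alone, even though $X_1$ itself is not a function of $S.$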

But sufficiency of $S$ does give what is needed: by definition, the conditional distribution of the full data $X$ given $S$ does not depend on $\theta.$ Since $T = T(X)$ is a function of the data, the conditional distribution of $T$ given $S$ is likewise free of $\theta,$ and therefore the conditional expectation of $T$ given $S$ does not depend on $\theta$: it is a fixed function $g(S).$
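To write this as an integral, as the question asks (a sketch, assuming $T$ has a conditional density $f_{T\mid S}(t\mid s)$ given $S = s,$ which by sufficiency of $S$ does not involve $\theta$):

$$E_\theta(T \mid S = s) = \int t \, f_{T \mid S}(t \mid s) \, dt =: g(s),$$

and the right-hand side involves $s$ only, so $E_\theta(T \mid S) = g(S)$ for every $\theta.$ In the discrete case the same argument runs with a sum over the possible values of $T$ in place of the integral.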