Doubt about definition of complete statistic


I know a statistic $T$ is complete if, for every measurable function $g$, $E_\theta(g(T))=0$ for all parameter values $\theta$ implies $g(T)=0$ almost surely. But what if I take $g(T)=T-E(T)$? The expectation of this $g(T)$ is 0, yet $g(T)$ itself is not necessarily 0 almost surely. By this reasoning, the sample mean in a normal model, for example, would not be complete. Where am I going wrong? Thank you
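To see numerically where the construction breaks down, here is a small Monte Carlo sketch (my own illustration, not from the question) with $T=\bar X$ the sample mean of $n$ i.i.d. $N(\theta,1)$ draws. The asker's $g(T)=T-E(T)=T-\theta$ depends on the parameter $\theta$, so it is not an admissible $g$. If we instead freeze a particular value `mu` and take $g(t)=t-\mu$, a genuine function of $t$ alone, then $E_\theta[g(T)]=\theta-\mu$, which is zero only at $\theta=\mu$, not for every $\theta$ as the completeness definition requires:

```python
import random

def mean_of_g(theta, mu, n=10, reps=50_000, seed=0):
    """Monte Carlo estimate of E_theta[g(T)] where T is the sample
    mean of n i.i.d. N(theta, 1) draws and g(t) = t - mu."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        t = sum(rng.gauss(theta, 1.0) for _ in range(n)) / n
        total += t - mu          # g(T) = T - mu, a function of T alone
    return total / reps

print(mean_of_g(theta=0.0, mu=0.0))  # close to 0: theta happens to equal mu
print(mean_of_g(theta=1.0, mu=0.0))  # close to 1, not 0
```

Completeness demands $E_\theta[g(T)]=0$ for *all* $\theta$ simultaneously, and no fixed function $t\mapsto t-\mu$ achieves that here.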

1 Answer


The question has essentially already been answered in a comment. A statistic is by definition a function of the data, and the function $g$ in the definition of completeness is a function of the statistic alone; it cannot depend on the parameter(s). If $E(T)$ does not depend on the parameter(s), and $T$ is not almost surely constant for every parameter value (so that your $g(T)=T-E(T)$ is not almost surely zero), then $T$ is indeed not complete. But that is not usually the situation.

In your example, the sample mean is a complete statistic for a normal distribution with unknown mean $\mu$: its expected value is $E_\mu(\bar X)=\mu$, which depends on the unknown mean, so your construction $\bar X-\mu$ is not a function of the statistic alone.
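For completeness of the picture, here is a short sketch of why the sample mean actually is complete in this model (assuming $X_1,\dots,X_n$ i.i.d. $N(\theta,\sigma^2)$ with $\sigma^2$ known and $T=\bar X$, so $\bar X\sim N(\theta,\sigma^2/n)$). If $E_\theta[g(\bar X)]=0$ for every $\theta$, then

$$\int_{-\infty}^{\infty} g(t)\, e^{-\frac{n(t-\theta)^2}{2\sigma^2}}\,dt = 0 \qquad \text{for all } \theta \in \mathbb{R}.$$

Expanding the square in the exponent, this says the two-sided Laplace transform of $t \mapsto g(t)\,e^{-nt^2/(2\sigma^2)}$ vanishes identically, and by uniqueness of the Laplace transform that function, and hence $g$, is zero almost everywhere. No such argument even gets started for $g(T)=T-E_\theta(T)$, because that expression changes with $\theta$ and so is not a single function of $T$.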