In my notes I have the following L-S theorem statement:
Let $T(X_1,...,X_n)$ be an estimator of $\theta \in \Theta$. If $T$ is:
- unbiased, and
- a function of a complete and sufficient statistic $S_c(X_1,...,X_n)$, so that we can write $T=g(S_c(X_1,...,X_n))$,
$\implies$ $T$ is the unique estimator that is both unbiased and a function of a complete and sufficient statistic.
My question is: does $g$ have to be a bijective function?
In my opinion, it doesn't have to be: this condition is never used in the proof of the theorem. However, my lecturer said it is needed, and I disagree.
No, $g$ does not need to be bijective. Consider the following example from Shao's Mathematical Statistics.
Consider the problem of estimating $\nu = g(\theta)$ for a smooth function $g$ on $(0,\infty)$ in the uniform model $U(0, \theta)$. The statistic $X_{(n)}$ is a complete and sufficient statistic for $\theta$, with density $n\theta^{-n}x^{n-1}I_{(0, \theta)}(x)$. By the law of the unconscious statistician, any unbiased estimator $h(X_{(n)})$ of $\nu$ must satisfy $$\theta^n g(\theta) = n\int_0^\theta h(x)x^{n-1}\, dx$$ for all $\theta > 0$. Differentiating both sides with respect to $\theta$ and applying the fundamental theorem of calculus gives $$n\theta^{n-1} g(\theta) + \theta^n g^\prime (\theta) = n h(\theta) \theta^{n-1}$$ for all $\theta >0$. Dividing through by $n\theta^{n-1}$ and substituting the CSS $X_{(n)}$ as the argument, as in the Lehmann-Scheffé theorem, the estimator must have the form $$h(X_{(n)}) = g(X_{(n)}) + n^{-1} X_{(n)} g^\prime(X_{(n)})$$
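As a quick numerical sanity check of this formula (a sketch, with the illustrative choice $g(x) = x^2$ and arbitrary values $\theta = 2$, $n = 10$ that are not from the original example), a Monte Carlo average of $h(X_{(n)})$ should land close to $g(\theta)$:

```python
import random

# Monte Carlo check (sketch): for U(0, theta), the estimator
#   h(X_(n)) = g(X_(n)) + X_(n) * g'(X_(n)) / n
# should be unbiased for g(theta). Illustrative choice: g(x) = x^2.
def g(x):
    return x * x

def g_prime(x):
    return 2 * x

def h(x, n):
    return g(x) + x * g_prime(x) / n

random.seed(0)
theta, n, reps = 2.0, 10, 200_000

total = 0.0
for _ in range(reps):
    x_max = max(random.uniform(0, theta) for _ in range(n))  # X_(n)
    total += h(x_max, n)
estimate = total / reps

print(estimate, g(theta))  # the two values should be close
```

The sample mean of $h(X_{(n)})$ over many replications approximates $E[h(X_{(n)})]$, which the derivation above says equals $g(\theta) = 4$ here.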
Picking any smooth $g$ that makes the resulting $h$ non-bijective (e.g. $g(x) = x^2 - x$, for which $h$ is a parabola with its vertex inside $(0,\infty)$) gives a negative answer to your question: $h$ is the unique unbiased function of the CSS, yet it is not injective.
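To make the non-injectivity concrete (a sketch, using the arbitrary choice $n = 10$): with $g(x) = x^2 - x$, the formula gives $h(x) = (1 + 2/n)x^2 - (1 + 1/n)x$, an upward-opening parabola whose vertex lies in $(0,\infty)$, so $h$ takes the same value at two distinct points:

```python
# For g(x) = x^2 - x, the Lehmann-Scheffe form gives
#   h(x) = (1 + 2/n) * x^2 - (1 + 1/n) * x,
# a parabola on (0, inf) with its vertex inside the domain,
# hence non-injective. Illustrative choice: n = 10.
n = 10

def h(x):
    return (1 + 2 / n) * x**2 - (1 + 1 / n) * x

vertex = (1 + 1 / n) / (2 * (1 + 2 / n))  # minimum of the parabola, > 0

# Two distinct points symmetric about the vertex have equal h-values:
a, b = vertex - 0.2, vertex + 0.2
print(a, b, h(a), h(b))  # h(a) and h(b) agree
```

Since $h(a) = h(b)$ with $a \neq b$, this unique unbiased function of the CSS is clearly not bijective.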