Let $\Psi= \{f_\theta: \theta \in \Theta\}$ be a statistical model and define $\Upsilon= \{T: E_\theta[T]= g(\theta)\ \forall\,\theta\in\Theta\}$, i.e., the class of unbiased estimators of $g(\theta)$. I have two doubts:
Does a UMVUE always exist? Thanks to the Rao–Blackwell theorem, we can improve an unbiased estimator using a sufficient statistic, i.e. take $E[T\mid U]$ where $T$ is our unbiased estimator and $U$ our sufficient statistic. Moreover, thanks to the Lehmann–Scheffé theorem, if $U$ is also complete, then $T^*= E[T\mid U]$ is UMVUE. My dilemma is that I wrote in my notes that a UMVUE for $g(\theta)$ does not always exist, but I cannot understand how that is possible. If a UMVUE does not always exist, it must be that either a complete statistic does not always exist, or an unbiased estimator of $g(\theta)$ that is a function of the complete statistic does not always exist. If this is true, could you provide a counterexample, i.e. an example where a UMVUE does not exist?
Suppose that $T$ is an efficient estimator for $g(\theta)$, i.e. $V(T)$ attains the Cramér–Rao lower bound. I already know that if $T$ is efficient for $g(\theta)$, then $a+bT$ is efficient for $a+b\,g(\theta)$, but for no other transformation. But is $h(T)$ always UMVUE for $h(g(\theta))$ for every function $h$? That is, if $T$ is an efficient estimator of $g(\theta)$, is a transformation of $T$ always UMVUE for the same transformation of $g(\theta)$?
Consider a single observation $X$ having the uniform distribution on $(\theta,\theta+1)$ and suppose we have to estimate $g(\theta)$ for some function $g$.
Here $X$ is minimal sufficient for $\theta$. As for completeness of $X$, notice that $$E_{\theta}[\sin (2\pi X)]=\int_{\theta}^{\theta+1}\sin (2\pi x)\,dx=0\quad,\,\forall\,\theta\in\mathbb R$$
However $\sin (2\pi X)$ is not almost surely $0$, so that $X$ is not a complete statistic.
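This is easy to check numerically. Below is an illustrative Monte Carlo sketch (assuming NumPy; the values of $\theta$ are arbitrary): the sample mean of $\sin(2\pi X)$ is near $0$ for every $\theta$, while the mean of $|\sin(2\pi X)|$ stays near $2/\pi\approx 0.637$, confirming that $\sin(2\pi X)$ is an unbiased estimator of zero without being almost surely zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sketch: under U(theta, theta+1), sin(2*pi*X) integrates
# over one full period, so its expectation is 0 for EVERY theta --
# yet sin(2*pi*X) is clearly not a.s. zero (E|sin(2*pi*X)| = 2/pi).
for theta in [-2.0, 0.3, 5.7]:
    x = rng.uniform(theta, theta + 1, size=1_000_000)
    s = np.sin(2 * np.pi * x)
    print(f"theta = {theta:5.1f}   "
          f"E[sin(2*pi*X)] ~= {s.mean():+.4f}   "
          f"E|sin(2*pi*X)| ~= {np.abs(s).mean():.4f}")
```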
In fact a complete sufficient statistic does not exist for this model.
To see whether the UMVUE of $g(\theta)$ actually exists, recall the necessary and sufficient condition for an unbiased estimator (with finite second moment) to be the UMVUE: it has to be uncorrelated with every unbiased estimator of zero.
If possible, suppose $T$ is UMVUE of $g(\theta)$. Let $\mathcal U_0$ be the class of all unbiased estimators of zero.
Clearly for every $H\in \mathcal U_0$,
$$\int_{\theta}^{\theta+1}H(x)\,dx=0\quad,\,\forall\,\theta\in\mathbb R$$
Differentiating both sides of the last equation with respect to $\theta$ gives
$$H(\theta+1)=H(\theta)\quad,\,\text{a.e.}\tag{1}$$
As $T$ is UMVUE, $E_{\theta}(TH)=0$ for all $\theta$ and for all $H\in \mathcal U_0$. In other words, $TH\in \mathcal U_0$ whenever $H\in \mathcal U_0$. So analogous to $(1)$ we have
$$T(\theta+1)H(\theta+1)=T(\theta)H(\theta)\quad,\,\text{a.e.}\tag{2}$$
Combining $(1)$ and $(2)$ gives $(T(\theta+1)-T(\theta))H(\theta)=0$ a.e.; choosing $H(x)=\sin(2\pi x)$, which lies in $\mathcal U_0$ and is nonzero a.e., we obtain $$T(\theta)=T(\theta+1)\quad,\,\text{a.e.}\tag{3}$$
Again, as $T$ is unbiased for $g(\theta)$, $$\int_{\theta}^{\theta+1} T(x)\,dx=g(\theta)\quad,\,\forall\,\theta\in\mathbb R $$
Differentiating both sides with respect to $\theta$ and using equation $(3)$ yields
$$g'(\theta)=T(\theta+1)-T(\theta)=0\quad,\,\text{a.e.}$$
This shows that $g(\theta)$ does not admit a UMVUE for any non-constant $g$.
So if you take $g(\theta)=\theta$, then $T=X-\frac12$ is unbiased for $\theta$, but $T$ is not UMVUE; indeed, no UMVUE of $\theta$ exists in this model.
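A quick numerical sketch of the unbiasedness of $X-\frac12$ (assuming NumPy; the values of $\theta$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: T = X - 1/2 is unbiased for theta under the U(theta, theta+1)
# model, even though no UMVUE of theta exists in this model.
for theta in [-1.0, 0.0, 2.5]:
    x = rng.uniform(theta, theta + 1, size=1_000_000)
    print(f"theta = {theta:4.1f}   E[X - 1/2] ~= {(x - 0.5).mean():+.4f}")
```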
As for the second question: even if $T$ is an unbiased estimator (efficient or not) of $\theta$, it does not follow that $g(T)$ is even unbiased (let alone UMVUE) for $g(\theta)$ when $g$ is an arbitrary nonlinear function.
Among several possible examples, consider i.i.d. observations $X_1,\ldots,X_n$ having an exponential distribution with mean $\theta$. It is easy to verify that the sample mean $\overline X$ is an efficient estimator (and UMVUE) of $\theta$, but $\overline X^2$ is not UMVUE of $\theta^2$: since $E_\theta[\overline X^2]=\theta^2+\operatorname{Var}_\theta(\overline X)=\frac{n+1}{n}\theta^2$, it is not even unbiased. (The UMVUE of $\theta^2$ here is $\frac{n}{n+1}\overline X^2$, a function of the complete sufficient statistic $\sum_i X_i$.)
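This bias is easy to see by simulation. The following is an illustrative sketch (assuming NumPy; the choices of `theta` and `n` are arbitrary): $\overline X^2$ overshoots $\theta^2$ by the factor $(n+1)/n$, while the corrected estimator $\frac{n}{n+1}\overline X^2$ centers on $\theta^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 500_000

# X_1..X_n i.i.d. Exponential(mean=theta); Xbar is UMVUE of theta,
# but Xbar^2 has expectation (n+1)/n * theta^2, hence is biased.
xbar = rng.exponential(theta, size=(reps, n)).mean(axis=1)

print(f"theta^2              = {theta**2:.3f}")
print(f"E[Xbar^2]           ~= {(xbar**2).mean():.3f}"
      f"  (theory: {(n + 1) / n * theta**2:.3f})")
print(f"E[n/(n+1)*Xbar^2]   ~= {(n / (n + 1) * xbar**2).mean():.3f}")
```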