Let's say that we want to find the uniformly minimum variance unbiased (UMVU) estimator of $\lambda^2$ based on a sample $X=(X_1,\dots,X_n)$ from a Poisson distribution with mean $\lambda$. We can consider $\overline{X}^2$ and notice that
$$\mathbb{E}(\overline{X}^2) = Var(\overline{X}) + \left(\mathbb{E}(\overline{X})\right)^2 = \frac{1}{n^2} \cdot n \cdot Var(X_1) + \lambda^2 = \lambda^2 + \frac{\lambda}{n} = \lambda^2 + \mathbb{E}\left(\frac{1}{n}\overline{X} \right)$$
This shows that $\overline{X}^2 - \frac{1}{n}\overline{X}$ is what we are looking for: it is unbiased for $\lambda^2$ and a function of $\overline{X}$, which for exponential families is a sufficient, and in this case also complete, statistic, so by the Lehmann-Scheffé theorem it is the UMVU estimator.
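As a quick sanity check of the bias correction (a Monte Carlo sketch, not part of the derivation; the helper `poisson_sample` and the values of $\lambda$ and $n$ are arbitrary choices of mine):

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: count uniform draws until their product drops below e^{-lam}
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

lam, n, reps = 2.0, 20, 100_000  # illustration values
rng = random.Random(0)

naive_sum = adjusted_sum = 0.0
for _ in range(reps):
    xbar = sum(poisson_sample(lam, rng) for _ in range(n)) / n
    naive_sum += xbar**2                # E = lam^2 + lam/n (biased)
    adjusted_sum += xbar**2 - xbar / n  # unbiased for lam^2

mean_naive, mean_adjusted = naive_sum / reps, adjusted_sum / reps
print(mean_naive)     # close to lam^2 + lam/n = 4.1
print(mean_adjusted)  # close to lam^2 = 4.0
```

With these values the naive estimator's average sits near $4.1$ and the corrected one near $4.0$, matching $\mathbb{E}(\overline{X}^2) = \lambda^2 + \lambda/n$.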
Now, let's consider a function like: $$g(\lambda) = \frac{\lambda^3}{3!}e^{-\lambda}.$$
This is simply the probability $P(X_i=3)$. What can we do to find a UMVU estimator of it? I thought of starting, just like above, from considering
$$T(X)= \frac{\overline{X}^3}{3!}e^{-\overline{X}},$$
and eventually adjusting it with some other components, but trying to find the expected value of this almost made me cry. I also thought about transforming it somehow and using Basu's theorem to make things easier, but I'm not sure how.
Any advice on how to do this? Or is there an easier way to find the UMVUE here?
Start from $$ g_n = \mathcal{I}\{X_1 = 3\} $$ as an unbiased estimator: $\mathbb{E}[g_n] = \mathbb{P}(X_1 = 3) = g(\lambda)$. Then, using Rao-Blackwell, compute $$ g_n^{RB} = \mathbb{E}\left[g_n \,\middle|\, \sum_{i=1}^n X_i = t\right]. $$ Note that $g_n^{RB}$ is an unbiased estimator and a function of the complete, minimal sufficient statistic $\sum_{i=1}^n X_i$. Thus, by Lehmann-Scheffé, it is the UMVUE.
Namely, for $t \ge 3$ (when $t < 3$ the estimator is $0$, consistent with $\binom{t}{3} = 0$), \begin{align} g_n^{RB} &= \mathbb{E}\left[g_n \,\middle|\, \sum_{i=1}^n X_i = t\right]\\ &= \frac{\mathbb{P}(X_1 = 3)\, \mathbb{P}\left( \sum_{i=2}^n X_i = t - 3\right)}{\mathbb{P}\left( \sum_{i=1}^n X_i = t\right)}\\ & =\frac{e^{-\lambda}\lambda^3/3! \times e^{-\lambda (n-1)}\lambda^{t-3} (n-1)^{t-3} / (t-3)!}{e^{-\lambda n}\lambda^t n^t/t!} \\ & = \left( \frac{n-1}{n} \right)^{t} \binom{t}{3}(n-1)^{-3} \\ & = \left( 1 - \frac{1}{n} \right)^{n\bar{x}_n} \binom{n\bar{x}_n}{3}(n-1)^{-3}, \end{align} where the second equality uses the independence of $X_1$ and $\sum_{i=2}^n X_i$. As a sanity check, you can use the continuous mapping theorem and the WLLN. Since $\bar{X}_n \xrightarrow{p} \lambda$ and $(1-1/n)^n \to e^{-1}$, we get $\left(1-\frac{1}{n}\right)^{n\bar{X}_n} \xrightarrow{p} e^{-\lambda}$, while $\binom{n\bar{X}_n}{3}(n-1)^{-3} = \frac{n\bar{X}_n(n\bar{X}_n-1)(n\bar{X}_n-2)}{3!\,(n-1)^3} \xrightarrow{p} \frac{\lambda^3}{3!}$. Combining it all, the estimator converges in probability to $g(\lambda)$.
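The unbiasedness is also easy to verify empirically. Below is a small simulation (a sketch; `poisson_sample` and `umvue_p3` are helper names of mine, and the values of $\lambda$ and $n$ are arbitrary): the average of $g_n^{RB}$ over many replications should sit near $g(\lambda) = \lambda^3 e^{-\lambda}/3!$.

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: count uniform draws until their product drops below e^{-lam}
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def umvue_p3(xs):
    # (1 - 1/n)^t * C(t, 3) / (n - 1)^3 with t = sum(xs); zero when t < 3
    n, t = len(xs), sum(xs)
    return (1 - 1 / n) ** t * math.comb(t, 3) / (n - 1) ** 3

lam, n, reps = 2.0, 15, 100_000  # illustration values
rng = random.Random(1)
mean_est = sum(
    umvue_p3([poisson_sample(lam, rng) for _ in range(n)]) for _ in range(reps)
) / reps
target = lam**3 / math.factorial(3) * math.exp(-lam)
print(mean_est, target)  # the two numbers should nearly agree
```

Since `math.comb(t, 3)` is $0$ for $t < 3$, the estimator handles small samples without any special casing.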