I've started learning methods for finding UMVUEs, and I found some nice examples on this site.
I got stuck while evaluating the UMVUE of a parametric function $g(\theta)$ for the Poisson distribution.

How did the author get $$\sum_{t=0}^\infty \frac{h(t) n^t}{t!} \theta^t = e^{n \theta}g(\theta)?$$ I thought the unbiasedness condition $$\mathbb{E}[h(T)] = g(\theta)$$ should be used somewhere.
Observe that $$\sum_{t=0}^{\infty}\dfrac{h(t)n^t}{t!}\theta^t = \sum_{t=0}^{\infty}\dfrac{h(t)(n\theta)^t}{t!}\text{.}$$

Set $\lambda = n\theta$. Then, multiplying and dividing by $e^{\lambda}$, $$\sum_{t=0}^{\infty}\dfrac{h(t)(n\theta)^t}{t!} = \sum_{t=0}^{\infty}\dfrac{h(t)\lambda^t}{t!} = \sum_{t=0}^{\infty}\dfrac{h(t)e^{-\lambda}e^{\lambda}\lambda^t}{t!} = e^{\lambda}\sum_{t=0}^{\infty}\left[h(t)\cdot \dfrac{e^{-\lambda}\lambda^t}{t!}\right]\text{.}$$

Since $T \sim P(\lambda)$ (using the notation in your image), the bracketed sum is an expectation against the Poisson pmf, so by unbiasedness $$\sum_{t=0}^{\infty}\left[h(t)\cdot \dfrac{e^{-\lambda}\lambda^t}{t!}\right] = \mathbb{E}[h(T)] = g(\theta)\text{,}$$ hence $$\sum_{t=0}^{\infty}\dfrac{h(t)n^t}{t!}\theta^t = e^{\lambda}g(\theta)=e^{n\theta}g(\theta)\text{.}$$