Let $Y_1,Y_2,\dotsc,Y_n \sim \operatorname{Poisson}(\lambda)$ be IID. Consider $U = \frac{1}{n} \sum_{i=1}^n Y_i$. Given the conjugate $\operatorname{Gamma}(\alpha,\beta)$ prior, I want to show that the posterior density of $\lambda \mid U$ is a gamma density.
I have been able to show that in this case, the joint likelihood for $U, \lambda$ is $$L(u, \lambda) = \frac{n^u}{u! \beta^{\alpha}\Gamma(\alpha)} \lambda^{u+\alpha-1} e^{\frac{-\lambda}{\beta/(n\beta+1)}}$$ and the marginal mass function of $U$ is $$m(u) = \frac{n^u\Gamma(u+\alpha)}{u!\beta^{\alpha}\Gamma(\alpha)} \left(\frac{\beta}{n\beta +1}\right)^{u+\alpha},$$ but I am having difficulty using these formulae to solve for the posterior density. Any recommendations on this problem?
Consider the joint distribution of the sample itself $\boldsymbol y$ and the parameter $\lambda$: $$f(\boldsymbol y, \lambda) = f(\boldsymbol y \mid \lambda) f(\lambda) = \left(\prod_{i=1}^n e^{-\lambda} \frac{\lambda^{y_i}}{y_i!} \right) \frac{\beta^\alpha \lambda^{\alpha-1} e^{-\beta \lambda}}{\Gamma(\alpha)}.$$ Viewed as a function of $\lambda$, this is proportional to $$L(\lambda \mid \boldsymbol y) \propto e^{-n \lambda} \lambda^{n\bar y} \lambda^{\alpha - 1} e^{-\beta \lambda} = \lambda^{n \bar y + \alpha - 1} e^{-(\beta + n)\lambda},$$ which is clearly proportional to a gamma density with posterior hyperparameters $\alpha^* = n \bar y + \alpha$ and $\beta^* = \beta + n$, where $\bar y$ is the sample mean. Note that $\bar y$ also happens to be the (sufficient) statistic $U$ in your statement of the problem; therefore, in order to update the hyperparameters for $\lambda$, it suffices to know (in addition to the prior hyperparameter values $\alpha$ and $\beta$) the sample size $n$ and the sample mean $\bar y$, rather than the actual sample $\boldsymbol y$.
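As a quick sanity check of this update, here is a short numerical sketch (the prior hyperparameters, the seed, and the true rate $\lambda = 1.5$ are made up for illustration; $\beta$ is a rate parameter, matching the parametrization above):

```python
import numpy as np
from scipy import stats

# Illustrative (made-up) prior hyperparameters; beta is a rate, as above.
alpha, beta = 2.0, 3.0

# Simulate a Poisson sample with an arbitrary true rate.
rng = np.random.default_rng(0)
y = rng.poisson(lam=1.5, size=50)
n, ybar = y.size, y.mean()

# Conjugate update: only n and the sample mean are needed, not y itself.
alpha_star = n * ybar + alpha   # = sum(y) + alpha
beta_star = beta + n

# Posterior for lambda; scipy's gamma takes a scale, so scale = 1/rate.
posterior = stats.gamma(a=alpha_star, scale=1.0 / beta_star)
print(posterior.mean())  # posterior mean of lambda, alpha*/beta*
```

Note that only $(n, \bar y)$ enter the update, which is the sufficiency point made above.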
Why did you run into problems? The first problem is that the statistic $U = \bar y$ is not itself Poisson: the mean of a set of nonnegative integers is not necessarily a nonnegative integer. The sum of IID Poisson variables is Poisson, but their mean is not.
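A tiny simulation makes this concrete (the seed, rate, and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.poisson(lam=2.0, size=7)

# The sum of the counts is always a nonnegative integer
# (and is itself Poisson with rate n*lambda) ...
total = y.sum()

# ... but the mean is total/7, which is generally not an integer,
# so it cannot have a Poisson distribution.
mean = y.mean()
print(total, mean)
```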
The second problem is that your likelihood doesn't seem correct; you will want to double-check the density of a gamma variable, including which parametrization of $\beta$ (rate or scale) you are using.
The third problem is that you have retained a number of factors in the likelihood that have no bearing on the parameter $\lambda$. The likelihood is a function of $\lambda$ for a fixed set of hyperparameters and a known sample. Therefore, to obtain a posterior distribution, we would need to find a constant of proportionality that would make the likelihood integrate to $1$ over its support; but in doing so, all these constant factors are irrelevant. So when recognizing the form of a posterior distribution, we can ignore factors like $\Gamma(\alpha)$, $\beta^\alpha$, and $y_i!$, because these are fixed with respect to $\lambda$. Once we discard them and retain only those factors that are variable functions of $\lambda$, the posterior distribution becomes evident without any need to do the integration or to find a marginal distribution.
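You can check numerically that normalizing only the $\lambda$-dependent factors already recovers the gamma posterior density (the hyperparameter and data values below are made up for illustration):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Made-up values for illustration.
alpha, beta = 2.0, 3.0
n, ybar = 10, 1.7          # sample size and sample mean (n*ybar = total count)

def kernel(lam):
    # Keep only the factors that vary with lambda; all constant
    # factors (Gamma(alpha), beta**alpha, the y_i!, ...) are dropped.
    return lam ** (n * ybar + alpha - 1) * np.exp(-(beta + n) * lam)

Z, _ = quad(kernel, 0, np.inf)   # constant of proportionality
post = stats.gamma(a=n * ybar + alpha, scale=1.0 / (beta + n))

lam = 1.3
print(kernel(lam) / Z, post.pdf(lam))  # the two values agree
```

The normalized kernel and the $\operatorname{Gamma}(n\bar y + \alpha,\, \beta + n)$ density coincide, which is exactly the "recognize the kernel" shortcut described above.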