Can't understand step in posterior mean derivation


I was studying the Conjugate Priors section of Berkeley's Stat260: Bayesian Modeling and Inference, where the following is given for Poisson–Gamma conjugacy (page 3):

The posterior distribution has density $$ P(\theta|x, \alpha)\propto \theta^{\sum_{j} x_{j} + \alpha_{1} - 1}e^{-(\alpha_{2}+n)\theta} $$ so that $$ E[\theta|x, \alpha] = \color{#C00}{\frac{\sum_{j}x_{j}+\alpha_{1}}{n+\alpha_{2}} = \kappa\frac{\alpha_{1}}{\alpha_{2}} + (1-\kappa)\frac{\sum_{j}x_{j}}{n}} $$ where $\kappa = \alpha_{1}/(\alpha_{2}+n)$.

Embarrassingly, I can't understand the equality I emphasized in red. Writing $\alpha \equiv \alpha_{1}$ and $\beta \equiv \alpha_{2}$, my attempt at expanding the right-hand side with the given value of $\kappa$ yields $$\frac{\alpha^{2}}{\beta(\beta+n)} + \frac{\sum_{j}x_{j}}{n}-\frac{\alpha\sum_{j}x_{j}}{n(\beta+n)},$$ and I cannot see how the equality holds. Can someone please help me with this?


Best answer:

I think there is a typo in your quoted definition of $\kappa$. If you redefine it as $\kappa\equiv\beta/(\beta+n)$ (i.e., $\alpha_{2}/(\alpha_{2}+n)$ in the notes' notation), then everything works out:

$$\frac{\sum_{i}x_{i}+\alpha}{n+\beta}=\frac{\sum_{i}x_{i}}{n+\beta}+\frac{\alpha}{n+\beta} = \frac{n}{n+\beta}\cdot\frac{\sum_{i}x_{i}}{n} + \frac{\alpha}{\beta}\cdot\frac{\beta}{n+\beta}$$ $$= (1-\kappa)\cdot\frac{\sum_{i}x_{i}}{n} + \kappa\frac{\alpha}{\beta}$$
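The identity is easy to verify numerically. The sketch below (my own check, not from the notes; the data values are arbitrary) uses exact rational arithmetic to confirm that with $\kappa = \beta/(\beta+n)$ the posterior mean equals the convex combination of the prior mean $\alpha/\beta$ and the sample mean:

```python
from fractions import Fraction

# Gamma(alpha, beta) prior (shape/rate) and some arbitrary Poisson counts.
alpha, beta = Fraction(3), Fraction(2)
x = [Fraction(v) for v in (4, 1, 6, 2, 3)]
n, s = len(x), sum(x)

# Posterior mean of Gamma(s + alpha, n + beta).
posterior_mean = (s + alpha) / (n + beta)

# Convex combination with the corrected kappa = beta / (beta + n).
kappa = beta / (beta + n)
convex_comb = kappa * (alpha / beta) + (1 - kappa) * (s / n)

print(posterior_mean, convex_comb, posterior_mean == convex_comb)
# → 19/7 19/7 True
```

Using `Fraction` instead of floats makes the equality check exact, so the `True` is not an artifact of rounding.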