Consider the following one-way random effects model:
$$X_{ij}\mid\mu, \alpha_i \sim N(\mu+\alpha_i, \sigma^2),\quad i=1,\dots,s,\ j=1,\dots,n, \text{ independent,}$$
$$\alpha_i \sim N(0, \sigma_A^2),\quad i=1,\dots,s, \text{ independent,}$$
$$\mu \sim \text{Uniform}(-\infty, \infty).$$
Show that the Bayes estimator of $\mu+\alpha_i$ under the squared error loss is given by
$$\frac{n \sigma_A^2}{n \sigma_A^2+ \sigma^2} \bar{X}_i + \frac{\sigma^2}{n \sigma_A^2+\sigma^2}\bar{X},$$
where $\bar{X}_i=\frac 1n \sum_{j=1}^n X_{ij}$ and $\bar{X}=\frac 1s \sum_{i=1}^s \bar{X}_i.$
My attempt:
Regard $i=1,\dots,s$ as indexing $s$ groups, each containing $n$ observations. The sample mean of group $i$ is $\bar{X}_i=\frac 1n \sum_{j=1}^n X_{ij}$.
Then by normal–normal conjugacy, conditional on $\mu$, the posterior of $\alpha_i$ given $X_{i1},\dots,X_{in}$ is
$$\alpha_i \mid \mu, X \sim N\!\left(\frac{n \sigma_A^2}{n \sigma_A^2+ \sigma^2}(\bar{X}_i-\mu),\ \frac{\sigma^2 \sigma_A^2}{\sigma^2 + n \sigma_A^2}\right).$$
Apart from the $\mu$ inside the mean, the shrinkage weight here is exactly the one in the first term of the final answer. I still need to deal with the $\mu$ part, i.e. average over the posterior of $\mu$.
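One way to finish: conditionally on $\mu$, the same conjugacy gives $E[\alpha_i \mid \mu, X] = \frac{n\sigma_A^2}{n\sigma_A^2+\sigma^2}(\bar{X}_i-\mu)$, while marginally $\bar{X}_i \mid \mu \sim N(\mu, \sigma_A^2+\sigma^2/n)$ independently over $i$, so the flat prior yields $\mu \mid X \sim N\big(\bar{X}, \frac{\sigma_A^2+\sigma^2/n}{s}\big)$. Iterated expectations then give
$$E[\mu+\alpha_i \mid X] = \bar{X} + \frac{n\sigma_A^2}{n\sigma_A^2+\sigma^2}(\bar{X}_i-\bar{X}) = \frac{n\sigma_A^2}{n\sigma_A^2+\sigma^2}\bar{X}_i + \frac{\sigma^2}{n\sigma_A^2+\sigma^2}\bar{X}.$$
Since the posterior of $(\mu,\alpha_1,\dots,\alpha_s)$ is jointly Gaussian, the posterior mean can also be computed exactly by linear algebra and compared against the closed form. A quick sanity-check sketch, with parameter values chosen purely for illustration (none come from the problem itself):

```python
import numpy as np

# Assumed (hypothetical) parameter values for the check.
rng = np.random.default_rng(0)
s, n = 4, 6                      # s groups, n observations per group
sigma2, sigmaA2 = 2.0, 3.0       # sigma^2 and sigma_A^2

# Simulate one data set from the model.
mu_true = 1.5
alpha_true = rng.normal(0.0, np.sqrt(sigmaA2), size=s)
X = rng.normal(mu_true + alpha_true[:, None], np.sqrt(sigma2), size=(s, n))

# Exact Gaussian posterior for theta = (mu, alpha_1, ..., alpha_s):
# the flat prior on mu contributes nothing to the precision matrix,
# each alpha_i contributes 1/sigma_A^2 on the diagonal, and the
# likelihood contributes (1/sigma^2) A^T A, where A maps theta to the
# ns cell means mu + alpha_i.
A = np.zeros((s * n, s + 1))
A[:, 0] = 1.0
for i in range(s):
    A[i * n:(i + 1) * n, 1 + i] = 1.0
Q = A.T @ A / sigma2 + np.diag([0.0] + [1.0 / sigmaA2] * s)
m = np.linalg.solve(Q, A.T @ X.ravel() / sigma2)  # posterior mean of theta

# Posterior mean of mu + alpha_i versus the claimed shrinkage formula.
post_mean = m[0] + m[1:]
Xbar_i = X.mean(axis=1)
w = n * sigmaA2 / (n * sigmaA2 + sigma2)
closed_form = w * Xbar_i + (1.0 - w) * Xbar_i.mean()
print(np.max(np.abs(post_mean - closed_form)))  # agreement at machine precision
```

The two computations agree up to floating-point error, which is consistent with the claimed estimator being the exact posterior mean of $\mu+\alpha_i$.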