Given a set of $n$ random observations $X_1, \dots, X_n$ on $X \sim N(\mu, 1)$, consider the problem of estimating $\mu$ under squared-error loss using an estimator based on $\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i$. Prove the following: an estimator of the form $t \bar{X} + s$, where $0 < t < 1$ and $s$ are two constants, is a Bayes estimator under some prior distribution.
I think I know how to do the above: construct a normal prior on $\mu$, derive the posterior distribution of $\mu$ given $\bar{X}$, and match the coefficients of the posterior mean to $t\bar{X} + s$. So I am not worried about that question. But I don't know how to do the question below:
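Concretely (writing the prior as $\mu \sim N(a, b^2)$, with $a, b^2$ placeholder parameters to be solved for), the coefficient matching I have in mind goes:
$$\mu \mid \bar{X} \sim N\!\left( \frac{n b^2 \bar{X} + a}{n b^2 + 1},\; \frac{b^2}{n b^2 + 1} \right),$$
so equating the posterior mean with $t\bar{X} + s$ gives
$$\frac{n b^2}{n b^2 + 1} = t \;\Rightarrow\; b^2 = \frac{t}{n(1-t)}, \qquad \frac{a}{n b^2 + 1} = s \;\Rightarrow\; a = \frac{s}{1-t}.$$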
Question: Construct a Bayes estimator of $\phi (x|\mu)$ for a fixed $x$, under the prior distribution above, where $\phi (x|\mu)$ is the pdf of $X$.
The prior I constructed for the first part is $\mu \sim N\left(\frac{s}{1-t}, \frac{t}{n(1-t)}\right)$ (the mean can also be written $s\left(1+\frac{t}{1-t}\right)$).
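As a sanity check on this prior, here is a small numerical sketch (the function name and the test values of $n$, $t$, $s$ are my own arbitrary choices) verifying that the resulting posterior mean is exactly $t\bar{X} + s$:

```python
import math

def posterior_mean(xbar, n, t, s):
    # Prior mu ~ N(s/(1-t), t/(n(1-t))), data mean xbar ~ N(mu, 1/n).
    m0 = s / (1 - t)           # prior mean
    v0 = t / (n * (1 - t))     # prior variance
    # Conjugate normal update: precisions add, means combine precision-weighted.
    post_precision = n + 1 / v0
    return (n * xbar + m0 / v0) / post_precision

# Arbitrary test values; the identity should hold for any 0 < t < 1.
n, t, s = 10, 0.4, 1.3
for xbar in (-2.0, 0.0, 3.7):
    assert math.isclose(posterior_mean(xbar, n, t, s), t * xbar + s)
```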
For a fixed $x$, we are trying to estimate $\phi(x|\mu) = \frac{1}{\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2}}$. Under squared-error loss the Bayes estimator should be the posterior expectation $E\left[\phi(x|\mu) \mid \bar{X}\right]$, i.e. the integral of $\frac{1}{\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2}}$ against the normal posterior of $\mu$, but I don't know how to carry out this computation.