Let $X_1, \dots, X_n$ be independent random variables, each normally distributed given its mean:
$X_i \mid \theta_i \sim N(\theta_i, 1), \quad i = 1, 2, \dots, n.$
Assume that $\theta_i \sim g(\theta), i = 1, 2, \dots, n$, for some prior with probability density function $g(\theta)$ that is unknown. Let $f(\cdot)$ be the marginal probability density function of the $X_i$'s. Namely,
$f(x_i) = \int_{-\infty}^\infty \phi(x_i - \theta)\, g(\theta)\, d\theta$, where $\phi(\cdot)$ is the pdf of the standard normal distribution. Let $\hat{\boldsymbol{\theta}} = (\hat{\theta}_1, \dots, \hat{\theta}_n)$ be an estimator of $\boldsymbol{\theta} = (\theta_1, \dots, \theta_n)$. Define the squared error loss function as
$L(\hat{\boldsymbol{\theta}}, \boldsymbol{\theta}) = \sum_{i=1}^n (\hat{\theta}_i - \theta_i)^2$.
Let $\hat{\boldsymbol{\theta}}^B = (\hat{\theta}_1^B, \dots, \hat{\theta}_n^B)$ be the Bayes estimator.
Calculate $\hat{\boldsymbol{\theta}}^B$ in terms of the posterior distribution of $\boldsymbol{\theta}$ given $\boldsymbol{X}$.
My attempt: I don't have a solution to this question, but I do know the following: since the loss function is squared error loss, the Bayes estimator is the posterior mean, i.e. $\hat{\theta}_i^B = E[\theta_i \mid \boldsymbol{X}]$. Moreover, because the pairs $(X_i, \theta_i)$ are independent across $i$, this should reduce to $\hat{\theta}_i^B = E[\theta_i \mid X_i]$. Is that the complete answer?
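As a sanity check on the claim that the posterior mean minimizes squared error loss, here is a small simulation. Since $g$ is unknown in the problem, this sketch *assumes* for illustration that $g$ is $N(0, \tau^2)$; in that conjugate case the posterior of $\theta_i$ given $X_i$ is normal with mean $\frac{\tau^2}{1+\tau^2} X_i$, so the Bayes estimator shrinks each observation toward zero and should beat the naive estimator $\hat{\theta}_i = X_i$ in average loss.

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau2 = 100_000, 2.0

# theta_i ~ g = N(0, tau^2)  (illustrative assumption; g is unknown in the problem)
theta = rng.normal(0.0, np.sqrt(tau2), size=n)
# X_i | theta_i ~ N(theta_i, 1)
x = rng.normal(theta, 1.0)

theta_naive = x                          # estimate theta_i by X_i itself
theta_bayes = tau2 / (1.0 + tau2) * x    # posterior mean E[theta_i | X_i]

loss_naive = np.mean((theta_naive - theta) ** 2)  # should be near 1
loss_bayes = np.mean((theta_bayes - theta) ** 2)  # should be near tau2/(1+tau2) = 2/3
print(loss_naive, loss_bayes)
```

The average loss of the posterior mean approaches the posterior variance $\tau^2/(1+\tau^2) = 2/3$, strictly below the loss $\approx 1$ of the naive estimator, consistent with the posterior mean being Bayes under squared error loss.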