Calculate Bayes estimator in terms of the posterior distribution of $\boldsymbol{\theta}$ given $\boldsymbol{X}$.


Let the $X_i$'s be independent random variables, each normally distributed given its mean:

$X_i \mid \theta_i \sim N(\theta_i, 1), \quad i = 1, 2, \dots, n.$

Assume that $\theta_i \sim g(\theta), i=1,2,\dots,n$ for some prior with an unknown probability density function $g(\theta)$. Let $f(\cdot)$ be the marginal probability density function of the $X_i$'s. Namely,

$f(x_i) = \int_{-\infty}^\infty \phi(x_i - \theta) g(\theta) d\theta$, where $\phi(\cdot)$ is the probability density function of the standard normal distribution. Let $\hat{\boldsymbol{\theta}} = (\hat{\theta}_1, \dots, \hat{\theta}_n)$ be an estimator of $\boldsymbol{\theta} = (\theta_1, \dots, \theta_n)$. Define the squared error loss function as

$L(\hat{\boldsymbol{\theta}}, \boldsymbol{\theta}) = \Sigma_{i=1}^n (\hat{\theta_i} - \theta_i)^2$.

Let $\hat{\boldsymbol{\theta}}^B = (\hat{\theta}_1^B, \dots, \hat{\theta}_n^B)$ be the Bayes estimator.

Calculate $\hat{\boldsymbol{\theta}}^B$ in terms of the posterior distribution of $\boldsymbol{\theta}$ given $\boldsymbol{X}$.

My attempt: I don't have the solution to this question. All I know is that, since the loss function is squared error loss, the Bayes estimator is the mean of the posterior distribution. Is that the complete answer?
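For completeness, here is the standard argument (a sketch, written per coordinate) that the posterior mean minimizes the posterior expected squared error loss:

$$
\begin{aligned}
E\!\left[(a - \theta_i)^2 \mid \boldsymbol{X}\right]
  &= a^2 - 2a\,E[\theta_i \mid \boldsymbol{X}] + E[\theta_i^2 \mid \boldsymbol{X}],\\
\frac{\partial}{\partial a}\,E\!\left[(a - \theta_i)^2 \mid \boldsymbol{X}\right]
  &= 2a - 2\,E[\theta_i \mid \boldsymbol{X}] = 0
  \;\Longrightarrow\; a = E[\theta_i \mid \boldsymbol{X}].
\end{aligned}
$$

Since the loss is a sum over coordinates, each term can be minimized separately, giving $\hat{\theta}_i^B = E[\theta_i \mid \boldsymbol{X}]$ for each $i$.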

1 Answer:
  1. You cannot do Bayesian analysis with an entirely unknown and unspecified prior. You can do Bayesian analysis with an uninformative prior and a very flexible likelihood.
  2. Your model boils down to observing samples from a Normal distribution with unit variance, where you wish to estimate the mean while holding the standard deviation at 1.
  3. If you wish to estimate with uninformative priors, you will not get a closed-form solution for the posterior because the underlying prior is improper. You can estimate this model using Stan or any one of many Bayesian estimation software packages. An easy way to start, should you wish to play with this, is brms in R.
  4. If you wish to estimate with a known, conjugate prior, then you are looking for Normal priors. The posterior is Normal with parameters $\frac{1}{\frac{1}{\sigma_0^2} + \frac{n}{\sigma^2}}\left(\frac{\mu_0}{\sigma_0^2} + \frac{\sum_{i=1}^n x_i}{\sigma^2}\right), \left(\frac{1}{\sigma_0^2} + \frac{n}{\sigma^2}\right)^{-1}$, where $\mu_0$ and $\sigma_0^2$ are the mean and the variance of the prior.
  5. I understand that it is likely you are looking to learn something for some other application or maybe this is a piece of a broader problem. That said, worrying too much about the prior when all you have is samples from the Normal is a bit silly IMO. You will note that the influence of the prior fades at the rate of $n$ in the conjugate prior. That means you are likely to very rapidly converge to the true mean regardless of how flat a prior you start with.
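To illustrate points 4 and 5, here is a minimal Python sketch of the conjugate Normal update (the function `normal_posterior` is my own helper, not a library call). It shows the prior's influence fading as $n$ grows:

```python
def normal_posterior(xs, mu0, sigma0_sq, sigma_sq=1.0):
    """Conjugate Normal update with known likelihood variance sigma_sq.

    Prior: theta ~ N(mu0, sigma0_sq); data: x_i | theta ~ N(theta, sigma_sq).
    Returns the posterior mean and variance from the formula in point 4.
    """
    n = len(xs)
    precision = 1.0 / sigma0_sq + n / sigma_sq  # posterior precision
    post_var = 1.0 / precision
    post_mean = post_var * (mu0 / sigma0_sq + sum(xs) / sigma_sq)
    return post_mean, post_var

# With only n = 3 observations the prior (mu0 = 0) still pulls the
# estimate away from the sample mean of 2:
print(normal_posterior([1.0, 2.0, 3.0], mu0=0.0, sigma0_sq=1.0))  # (1.5, 0.25)

# With many observations the posterior mean approaches the sample mean
# even under a badly misspecified prior, matching point 5:
mean_large_n, _ = normal_posterior([2.0] * 1000, mu0=-10.0, sigma0_sq=1.0)
print(round(mean_large_n, 3))
```

Note how the posterior precision in the code is the sum of the prior precision and $n$ times the data precision, which is exactly why the prior's weight decays at rate $n$.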