Bayesian statistics, bivariate prior distribution


I've got a simple question but I'm not sure how to solve it. It's a bit long.

Suppose you've got $n$ iid random variables $X_1$, $\dots$, $X_n$ from the normal distribution with unknown mean $M$ and unknown precision (inverse variance) $H$. Then we've got the likelihood function for data $X_1=x_1$, $\dots$, $X_n=x_n$ $$L_n(\mu,h)\propto h^{n/2}\exp\left(-\frac{1}{2}h\left(n(\bar{x}-\mu)^2+S\right)\right), $$ where $\bar{x}$ is the mean of $x_1$, $\dots$, $x_n$ and $S=\sum_i(x_i-\bar{x})^2$.
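A quick numeric sanity check of the decomposition behind that likelihood, $\sum_i(x_i-\mu)^2 = n(\bar{x}-\mu)^2 + S$ (the values of $x$, $\mu$ are arbitrary illustrations, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=5)          # illustrative data
mu = 0.3                        # arbitrary candidate mean

xbar = x.mean()
S = ((x - xbar) ** 2).sum()

# Identity used to rewrite the exponent of the likelihood:
# sum_i (x_i - mu)^2 = n*(xbar - mu)^2 + S
lhs = ((x - mu) ** 2).sum()
rhs = len(x) * (xbar - mu) ** 2 + S
assert np.isclose(lhs, rhs)
```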

Now, a bivariate prior distribution for $(M,H)$ is specified, in terms of hyperparameters $(\alpha_0,\beta_0,m_0,\lambda_0)$, as follows. The marginal distribution of $H$ is $\Gamma(\alpha_0,\beta_0)$ with density $$\pi(h)\propto h^{\alpha_0-1}e^{-\beta_0h}$$ for $h>0$, and the conditional distribution of $M$ given $H=h$ is normal with mean $m_0$ and precision $\lambda_0h$.
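The hierarchical prior can be simulated in two steps, matching the specification above: draw $H$ from the Gamma marginal, then $M$ given $H=h$ from the conditional normal. A minimal sketch (the hyperparameter values are illustrative placeholders, not given in the question); note NumPy's Gamma takes a scale, i.e. $1/\beta_0$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (alpha0, beta0, m0, lam0) -- not from the question.
alpha0, beta0, m0, lam0 = 2.0, 1.0, 0.0, 1.0

# H ~ Gamma(alpha0, rate=beta0); NumPy parametrizes by scale = 1/rate.
h = rng.gamma(shape=alpha0, scale=1.0 / beta0)

# M | H=h ~ Normal(m0, precision lam0*h), so sd = 1/sqrt(lam0*h).
m = rng.normal(loc=m0, scale=1.0 / np.sqrt(lam0 * h))
```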

Now I should find the posterior joint distribution of $(M,H)$ given data $X_1=x_1$, $\dots$, $X_n=x_n$ and give the updated hyperparameters $(\alpha_n,\beta_n,m_n,\lambda_n)$ in terms of the prior hyperparameters and the data.

Could somebody please tell me how to do this? Thank you very much.


Your prior is just a Normal-gamma distribution. The likelihood is a standard normal likelihood, taken as the product over the $n$ conditionally independent observations (with the terms in the exponent rewritten by completing the square). The prior is conjugate, so the posterior is also a Normal-gamma distribution. I would advise you to slog through the algebra and read off the posterior parameters from the proportional form $$ \text{Posterior} \propto \text{Prior} \times \text{Likelihood}, $$

but if you want to verify your answer you can do so here (under Normal likelihood with unknown but exchangeable mean and precision).
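Once you have slogged through the algebra, one way to double-check the result is to code the standard Normal-gamma update and confirm it is self-consistent: updating on all the data at once must agree with updating one observation at a time (conjugacy guarantees this). A sketch, assuming the usual update formulas $\lambda_n=\lambda_0+n$, $m_n=(\lambda_0 m_0+n\bar{x})/\lambda_n$, $\alpha_n=\alpha_0+n/2$, $\beta_n=\beta_0+S/2+\lambda_0 n(\bar{x}-m_0)^2/(2\lambda_n)$:

```python
import numpy as np

def posterior_hyperparams(x, alpha0, beta0, m0, lam0):
    """Normal-gamma conjugate update for n iid Normal(mu, 1/h) observations."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    S = ((x - xbar) ** 2).sum()
    lam_n = lam0 + n
    m_n = (lam0 * m0 + n * xbar) / lam_n
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + S / 2 + lam0 * n * (xbar - m0) ** 2 / (2 * lam_n)
    return alpha_n, beta_n, m_n, lam_n

# Consistency check: a batch update equals a sequence of single-point updates.
x = np.array([1.0, 2.0, 0.5, -0.3])           # illustrative data
batch = posterior_hyperparams(x, 2.0, 1.0, 0.0, 1.0)
a, b, m, lam = 2.0, 1.0, 0.0, 1.0
for xi in x:
    a, b, m, lam = posterior_hyperparams([xi], a, b, m, lam)
assert np.allclose(batch, (a, b, m, lam))
```

If your hand-derived $(\alpha_n,\beta_n,m_n,\lambda_n)$ fail this sequential-vs-batch check, the $\beta_n$ term (where the completed square ends up) is the usual culprit.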