Normal prior and normal likelihood with different variances.


Suppose that, given the parameter $\theta$, $X_{1}$ and $X_{2}$ are independent with $X_{i} \mid \theta \sim \mathrm{N}\left(\theta, \sigma_{i}^{2}\right)$. The parameter $\theta$ has a normal prior, $\theta \sim \mathrm{N}\left(\mu, v^{2}\right)$. The parameters $\sigma_{1}$, $\sigma_{2}$ and $v$ are known.

  • Compute the posterior distribution of $\theta \mid X_{1}, X_{2}$.
  • Compute $\operatorname{cov}(X_1, X_2)$.
  • Compute the marginal distribution of $(X_1, X_2)$.

My intuition: $$ P(\theta \mid X_1, X_2) \propto P(X_1\mid\theta)P(X_2\mid\theta)P(\theta) $$ Proposition: we know that the posterior obtained from $P(X_1\mid\theta)P(\theta)$ alone is $$ \mathrm{N}\left(\mu_{x}, \sigma_{x}^{2}\right), \text{ where } \mu_{x}=\frac{v^{2} x_1+\sigma_{1}^{2} \mu}{v^{2}+\sigma_{1}^{2}}, \quad \sigma_{x}^{2}=\frac{\sigma_{1}^{2} v^{2}}{v^{2}+\sigma_{1}^{2}}. $$ So the next step is to take $\theta_1 \sim \mathrm{N}\left(\mu_{x}, \sigma_{x}^{2}\right)$ as the new prior and combine it with $P(X_2\mid\theta)$.
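This sequential update does work. A minimal numerical sketch (all parameter and data values below are my own arbitrary choices, for illustration only) confirms that updating on $x_1$ first and then on $x_2$ gives the same posterior as a single combined update:

```python
# Sequential conjugate-normal updating: the posterior after x1 becomes
# the prior for x2. All numbers below are arbitrary example values.

def normal_update(mu0, v0_sq, x, sigma_sq):
    """One conjugate update: prior N(mu0, v0_sq), likelihood N(theta, sigma_sq)."""
    post_var = 1.0 / (1.0 / v0_sq + 1.0 / sigma_sq)
    post_mean = post_var * (mu0 / v0_sq + x / sigma_sq)
    return post_mean, post_var

mu, v_sq = 1.0, 4.0          # prior N(mu, v^2)
s1_sq, s2_sq = 2.0, 3.0      # likelihood variances sigma_1^2, sigma_2^2
x1, x2 = 0.5, 2.5            # observations

# Update on x1, then reuse the result as the prior when updating on x2.
m1, t1 = normal_update(mu, v_sq, x1, s1_sq)
m12, t12 = normal_update(m1, t1, x2, s2_sq)
print(m12, t12)
```

The agreement with the one-shot formula holds because precisions simply add under conjugate normal updating, so the order of the observations does not matter.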

But this gets messy. Is there an easier way to compute the posterior distribution?

My attempt to compute $\operatorname{cov}(X_1, X_2) = E(X_1 X_2) - E(X_1)E(X_2)$:

We know that $E(X_i) = E(E(X_i \mid \theta)) = E(\theta) = \mu$.

The next step is the problem: computing $E(X_1 X_2)$. $X_1$ and $X_2$ are independent given $\theta$; does that mean $\operatorname{cov}(X_1, X_2) = 0$?
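A quick simulation can check whether the covariance is actually zero (a sketch; the parameter values are arbitrary choices of mine, used only to probe the question numerically):

```python
import random

# Monte Carlo check of cov(X1, X2) under the hierarchical model:
# theta ~ N(mu, v^2), then X_i | theta ~ N(theta, sigma_i^2) independently.
# All parameter values are arbitrary, for illustration only.
random.seed(0)
mu, v = 1.0, 2.0       # prior mean and standard deviation
s1, s2 = 1.5, 0.5      # likelihood standard deviations
n = 200_000

xs1, xs2 = [], []
for _ in range(n):
    theta = random.gauss(mu, v)
    xs1.append(random.gauss(theta, s1))
    xs2.append(random.gauss(theta, s2))

m1 = sum(xs1) / n
m2 = sum(xs2) / n
cov = sum((a - m1) * (b - m2) for a, b in zip(xs1, xs2)) / n
print(cov)
```

The estimate comes out near $v^2 = 4$, not near $0$: the shared draw of $\theta$ correlates the two observations, so conditional independence given $\theta$ does not imply marginal independence.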

Best answer:

Sketch (I did not carry out all the calculations, just the main ones):

I would start from the joint density of $(X_1, X_2, \theta)$:

$$f_{x_1x_2\theta}\propto \exp\Bigg\{-\frac{(x_1-\theta)^2}{2\sigma_1^2}-\frac{(x_2-\theta)^2}{2\sigma_2^2}-\frac{(\theta-\mu)^2}{2v^2}\Bigg\}$$

After some algebraic manipulations...

$$ \bbox[5px,border:2px solid black] { f_{x_1x_2\theta}\propto \exp\left\{-\frac{1}{2\cdot\frac{1}{\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}+\frac{1}{v^2}}}\left[\theta^2-2\theta\frac{\frac{x_1}{\sigma_1^2}+\frac{x_2}{\sigma_2^2}+\frac{\mu}{v^2}}{\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}+\frac{1}{v^2}}+?\right]\right\} \qquad (1) } $$

and a factor not depending on $\theta$ (but needed later to derive the marginal), namely

$$\exp\Bigg\{-\frac{\mu^2}{2v^2}-\frac{x_1^2}{2\sigma_1^2}-\frac{x_2^2}{2\sigma_2^2}\Bigg\}$$

Now it is easy to complete the square in (1), because the integrand is the kernel of a Gaussian with

$$\text{mean}=m=\frac{\frac{x_1}{\sigma_1^2}+\frac{x_2}{\sigma_2^2}+\frac{\mu}{v^2}}{\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}+\frac{1}{v^2}}$$

$$\text{variance}=\tau^2=\frac{1}{\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}+\frac{1}{v^2}}$$

This is your posterior: a Gaussian $\mathrm{N}(m, \tau^2)$.
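As a sanity check, $m$ and $\tau^2$ can be compared against a brute-force numerical posterior (a sketch; all parameter and data values are arbitrary examples of mine): normalize the unnormalized posterior kernel on a grid and compute its moments.

```python
import math

# Grid check of the posterior N(m, tau^2): normalize
#   exp{-(x1-t)^2/(2 s1^2) - (x2-t)^2/(2 s2^2) - (t-mu)^2/(2 v^2)}
# numerically over t and compare its mean/variance with the closed form.
# Parameter values are arbitrary, for illustration only.
mu, v2 = 1.0, 4.0
s1_2, s2_2 = 2.0, 3.0
x1, x2 = 0.5, 2.5

def kernel(t):
    return math.exp(-(x1 - t) ** 2 / (2 * s1_2)
                    - (x2 - t) ** 2 / (2 * s2_2)
                    - (t - mu) ** 2 / (2 * v2))

grid = [i * 0.001 for i in range(-10_000, 12_000)]  # wide enough for the mass
w = [kernel(t) for t in grid]
z = sum(w)
mean = sum(t * wi for t, wi in zip(grid, w)) / z
var = sum((t - mean) ** 2 * wi for t, wi in zip(grid, w)) / z

# Closed-form posterior parameters from the answer above.
tau2 = 1.0 / (1 / s1_2 + 1 / s2_2 + 1 / v2)
m = tau2 * (x1 / s1_2 + x2 / s2_2 + mu / v2)
print(mean, m, var, tau2)
```

The grid moments match the precision-weighted formulas to numerical accuracy, which is a cheap way to catch algebra slips when completing the square.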

After completing the square in the exponent, integrate (1) with respect to $\theta$ to get the marginal density of $(X_1, X_2)$, which (together with the factor above) one should recognize as a bivariate joint Gaussian with some parameters, including a correlation.
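One shortcut for that last step (a sketch, not the integration carried out in full): writing each observation as $\theta$ plus independent noise makes the marginal immediate.

```latex
% Sketch: marginal of (X_1, X_2) without integrating (1) directly.
% Write each observation as the parameter plus independent Gaussian noise.
\[
X_i = \theta + \varepsilon_i, \qquad
\varepsilon_i \sim \mathrm{N}(0, \sigma_i^2)
\ \text{independent of}\ \theta \sim \mathrm{N}(\mu, v^2),
\]
so $(X_1, X_2)$ is a linear image of the Gaussian vector
$(\theta, \varepsilon_1, \varepsilon_2)$ and hence bivariate Gaussian, with
\[
E(X_i) = \mu, \qquad
\operatorname{Var}(X_i) = \sigma_i^2 + v^2, \qquad
\operatorname{cov}(X_1, X_2) = \operatorname{Var}(\theta) = v^2 .
\]
```

This also settles the covariance question from the post: the shared $\theta$ contributes $v^2$ to the covariance, so it is not zero.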