Bayesian inference from normal distribution


A friend asked me this question from Bertsekas's probability book. The question is about some algebraic manipulation at the bottom of the passage, but the entire passage is included for context:

[image of the textbook passage]
and $d$ is a constant that depends on $x_i$ but not $\theta$ (this last part got cut off in the image)
Our question is about why $$c_1c_2 \cdot \exp \left( -\sum_{i=0}^n \frac{(x_i- \theta)^2}{2 \sigma_i^2} \right)=d \cdot \exp \left( \frac{-( \theta -m)^2}{2v} \right).$$ As stated, this should just be by "some algebra, which involves completing the square...". Two points confuse us: the square already looks complete on both sides, and it is unclear whether $d$ may also depend on the $\sigma_i^2$. We've tried to show this, but the expressions just got uglier and uglier. Thanks in advance for any help.

On BEST ANSWER

In the step you are referencing, the square is to be completed in $\theta$: the LHS is not yet a completed square because its exponent is a sum of separate squares, one per term, and it must be collected into a single quadratic in $\theta$ so that the expression can be read as a normal density kernel for the posterior distribution of $\theta$.
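As a concrete illustration (a made-up two-term instance, not taken from the text), one can check symbolically that subtracting the completed square from the sum of squares leaves a constant free of $\theta$:

```python
import sympy as sp

theta = sp.symbols('theta')

# Hypothetical data: x_1 = 1 with sigma_1^2 = 1, and x_2 = 3 with sigma_2^2 = 4.
x = [sp.Integer(1), sp.Integer(3)]
sigma2 = [sp.Integer(1), sp.Integer(4)]

# The sum of squares appearing in the exponent on the LHS.
S = sum((xi - theta)**2 / (2 * s2) for xi, s2 in zip(x, sigma2))

# m and v as produced by completing the square:
#   1/v = sum 1/sigma_i^2,   m/v = sum x_i/sigma_i^2.
v = 1 / sum(1 / s2 for s2 in sigma2)
m = v * sum(xi / s2 for xi, s2 in zip(x, sigma2))

# The leftover C = S - (theta - m)^2 / (2v) should contain no theta at all.
C = sp.expand(S - (theta - m)**2 / (2 * v))
print(C)  # a pure number: the constant later absorbed into d
assert theta not in C.free_symbols
```

For these numbers the leftover is the rational constant $2/5$, confirming that everything except the $(\theta-m)^2/(2v)$ term is $\theta$-free.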

How does the algebra actually proceed? Consider just the summation in the LHS expression:

$$\begin{align*}
\sum_{i=0}^n \frac{(x_i - \theta)^2}{2\sigma_i^2}
&= \frac{1}{2} \sum_{i=0}^n \left(\frac{\theta^2}{\sigma_i^2} - \frac{2x_i}{\sigma_i^2} \theta + \frac{x_i^2}{\sigma_i^2}\right) \\
&= \frac{1}{2} \left( \frac{\theta^2}{v} - 2 \frac{m}{v} \theta + \sum_{i=0}^n \frac{x_i^2}{\sigma_i^2} \right) \\
&= \frac{1}{2v} \left( (\theta^2 - 2m \theta + m^2) - m^2 + v \sum_{i=0}^n \frac{x_i^2}{\sigma_i^2} \right) \\
&= \frac{(\theta-m)^2}{2v} + C,
\end{align*}$$ where the second line uses the text's definitions $\frac{1}{v} = \sum_{i=0}^n \frac{1}{\sigma_i^2}$ and $\frac{m}{v} = \sum_{i=0}^n \frac{x_i}{\sigma_i^2}$, and $C = \sum_{i=0}^n \frac{x_i^2}{2\sigma_i^2} - \frac{m^2}{2v}$ is a constant that does not depend on $\theta$ (it does depend on the $x_i$ and the $\sigma_i^2$, and so may $d$, which resolves your second confusion). Taking exponentials then gives $$c_1 c_2 \exp\left(-\sum_{i=0}^n \frac{(x_i-\theta)^2}{2\sigma_i^2}\right) = c_1 c_2\, e^{-C} \exp\left(\frac{-(\theta-m)^2}{2v}\right),$$ so the multiplicative constant in the text is $d = c_1 c_2\, e^{-C}$.
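A quick numerical sanity check of the identity (with made-up data, since the actual $x_i$ and $\sigma_i^2$ are not given): if the algebra above is right, the ratio of the LHS exponential to the RHS kernel must be the same constant $e^{-C}$ for every value of $\theta$.

```python
import numpy as np

# Made-up observations and variances standing in for x_i and sigma_i^2.
rng = np.random.default_rng(0)
x = rng.normal(size=6)
sigma2 = rng.uniform(0.5, 2.0, size=6)

# m and v from the completed square: 1/v = sum 1/sigma_i^2, m/v = sum x_i/sigma_i^2.
v = 1.0 / np.sum(1.0 / sigma2)
m = v * np.sum(x / sigma2)

lhs = lambda t: np.exp(-np.sum((x - t) ** 2 / (2 * sigma2)))
rhs = lambda t: np.exp(-(t - m) ** 2 / (2 * v))

# lhs(theta) / rhs(theta) should equal the same constant exp(-C) for every theta.
thetas = np.linspace(-2.0, 2.0, 9)
ratios = np.array([lhs(t) / rhs(t) for t in thetas])
print(np.allclose(ratios, ratios[0]))  # True: the ratio is theta-independent
```

The constant ratio is exactly the $e^{-C}$ that, together with $c_1 c_2$, gets bundled into $d$.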