Bayes’ theorem for Gaussians (Multivariate-Gaussian)


I have a question regarding Bayes' theorem for Gaussians.

Assume we have the multivariate Gaussian density function:

$$x=(X_1, X_2)^T$$

where $X_1$ and $X_2$ are random variables,

$$\Sigma = \begin{bmatrix}\sigma^2 & \alpha \sigma^2 \\ \alpha \sigma^2 & \sigma^2 \end{bmatrix}$$

$$ f(x \mid \mu, \Sigma) = \frac{1}{\sqrt{(2\pi)^{2}|\Sigma|}}\exp\left(-\frac{1}{2}x^{T}\Sigma^{-1}x\right)$$

I want to calculate the joint probability $p(X_1, X_2)$, but I found this pretty hard to integrate.
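As a side note, the density above evaluated at a point already gives the joint density of $(X_1, X_2)$, so no integration is needed just to evaluate it. A minimal NumPy sketch, assuming illustrative values for $\sigma$ and $\alpha$ (the question leaves them symbolic):

```python
import numpy as np

# Hypothetical values for illustration; sigma and alpha are symbolic in the question
sigma, alpha = 1.0, 0.5

# Covariance matrix Sigma from the question (the mean is zero)
Sigma = sigma**2 * np.array([[1.0, alpha],
                             [alpha, 1.0]])

def joint_density(x, Sigma):
    """Evaluate the bivariate Gaussian density f(x | 0, Sigma), i.e. p(X1, X2)."""
    d = len(x)
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return norm * np.exp(-0.5 * x @ np.linalg.solve(Sigma, x))

print(joint_density(np.array([0.0, 0.0]), Sigma))
```

At the mean this reduces to $1 / (2\pi\sqrt{|\Sigma|})$, which is an easy sanity check.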

Now I have found the Bayes' theorem for Gaussians, which could help me achieve this result:

$$z = (X_1, X_2)^T$$ $$p(z) = N(z \mid m, R^{-1})$$

$$m = \begin{bmatrix}\mu \\ A\mu + b \end{bmatrix}$$

$$ R^{-1}=\begin{bmatrix}\Lambda^{-1} & \Lambda^{-1} A^{T} \\ A \Lambda^{-1} & L^{-1}+A \Lambda^{-1} A^{T} \end{bmatrix}$$
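One way to see how these blocks relate to the $\Sigma$ above is to plug in a candidate set of parameters and check numerically. The matching used below ($\Lambda^{-1} = \sigma^2$, $A = \alpha$, $b = 0$, $L^{-1} = \sigma^2(1-\alpha^2)$) is an assumption for illustration, based on factoring the joint as $p(X_1)\,p(X_2 \mid X_1)$:

```python
import numpy as np

# Hypothetical values for illustration
sigma, alpha = 1.5, 0.3

# Target covariance from the question
Sigma = sigma**2 * np.array([[1.0, alpha],
                             [alpha, 1.0]])

# Candidate matching (an assumption): Lambda^{-1} = sigma^2, A = alpha,
# b = 0, L^{-1} = sigma^2 * (1 - alpha^2); scalars written as 1x1 matrices
Lam_inv = np.array([[sigma**2]])
A = np.array([[alpha]])
L_inv = np.array([[sigma**2 * (1 - alpha**2)]])

# Assemble the joint covariance R^{-1} block by block, as in the formula above
R_inv = np.block([
    [Lam_inv,      Lam_inv @ A.T],
    [A @ Lam_inv,  L_inv + A @ Lam_inv @ A.T],
])

print(np.allclose(R_inv, Sigma))  # prints True: the blocks reproduce Sigma
```

The bottom-right block works out to $\sigma^2(1-\alpha^2) + \alpha^2\sigma^2 = \sigma^2$, matching $\Sigma$ entry by entry.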

This is probably the way I could calculate $p(X_1, X_2)$, but I still don't know how to calculate $\Lambda$, $L$, $A$, and $b$.

I have $\mu_{X_1}$ and $\mu_{X_2}$ given as $0$, so I could probably calculate $A$ and $b$ from

$$A \cdot 0 + b = 0.$$

But then $A$ would need to be a scalar and not a matrix...
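For what it's worth, with $\mu = 0$ that equation only fixes $b = 0$ and says nothing about $A$. One matching consistent with the $\Sigma$ above (a sketch via the factorization $p(X_1)\,p(X_2 \mid X_1)$, stated here as an assumption rather than the only possibility) would be

$$p(X_1) = N(X_1 \mid 0, \sigma^2), \qquad p(X_2 \mid X_1) = N\left(X_2 \mid \alpha X_1,\; \sigma^2(1-\alpha^2)\right),$$

i.e. $\Lambda^{-1} = \sigma^2$, $A = \alpha$, $b = 0$, $L^{-1} = \sigma^2(1-\alpha^2)$; these are scalars here simply because each block of $z$ is one-dimensional.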

I am really confused by all the variables that I don't know how to calculate.

Hopefully someone with more experience can help me :)