Equating identities in Bayes' inference and joint density


I just started learning Bayesian inference and I'm confused about the basics.

Let $\theta, \phi$ be random variables with joint density $p(\theta, \phi) \propto \exp(-H(\theta, \phi)) = \exp(-V(\theta)) \exp(-\frac{1}{2} \phi^{T} M^{-1} \phi)$,

such that the marginal density of $\theta$ is simply $\propto \exp(-V(\theta))$.

Here's my understanding:

By the product rule (Bayes' rule), for fixed $\phi$: $p(\theta, \phi) = p(\theta \mid \phi)\,p(\phi) \propto p(\theta \mid \phi)$.

From this: $p(\theta \mid \phi) \propto \exp(-V(\theta))$ and $p(\phi) \propto \exp(-\frac{1}{2} \phi^{T} M^{-1} \phi)$.

But the marginal density of $\theta$ is $p(\theta) = \int_{-\infty}^{\infty} p(\theta, \phi) \, d\phi = \int_{-\infty}^{\infty} p(\theta \mid \phi)\,p(\phi) \, d\phi$.

How do I see that this marginal is proportional to $\exp(-V(\theta))$?


1 Answer


The joint density of $\theta$ and $\phi$ has the form $f(\theta)g(\phi)$. This factorization implies that $\theta$ and $\phi$ are independent, with $p(\theta) \propto f(\theta)$ and $p(\phi) \propto g(\phi)$. Concretely, $p(\theta) = \int f(\theta)g(\phi)\,d\phi = f(\theta)\int g(\phi)\,d\phi \propto f(\theta)$, since the $\phi$-integral is a constant not depending on $\theta$ - see joint density and independence.
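A quick numerical sketch of this argument: because the joint factorizes, the $\phi$-integral evaluates to the same constant for every $\theta$, so dividing the numerically marginalized density by $\exp(-V(\theta))$ gives a flat ratio. The choices $V(\theta) = \theta^4/4$ and scalar $M = 2$ below are made up purely for illustration, not taken from the question.

```python
import numpy as np

# Hypothetical choices for illustration (not from the question):
# a non-Gaussian potential V(theta) = theta**4 / 4 and scalar M = 2.0,
# so the joint is p(theta, phi) ∝ exp(-V(theta)) * exp(-phi**2 / (2*M)).
def V(theta):
    return theta**4 / 4.0

M = 2.0
phi = np.linspace(-20.0, 20.0, 4001)   # wide grid: integrand is ~0 at the ends
dphi = phi[1] - phi[0]
thetas = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])

# Marginalize numerically: p(theta) ∝ ∫ exp(-V(theta)) exp(-phi**2/(2M)) dphi.
# The theta-dependent factor pulls out of the integral, so the ratio
# marginal / exp(-V(theta)) should be the same constant for every theta.
marginal = np.array([(np.exp(-V(t)) * np.exp(-phi**2 / (2 * M))).sum() * dphi
                     for t in thetas])
ratio = marginal / np.exp(-V(thetas))
print(ratio)  # every entry is the same constant, sqrt(2*pi*M) ≈ 3.5449
assert np.allclose(ratio, ratio[0])
```

The flat ratio is exactly the normalizing constant $\int \exp(-\frac{1}{2}\phi^T M^{-1}\phi)\,d\phi$, which is why the marginal of $\theta$ is proportional to $\exp(-V(\theta))$ alone.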