Integrating over parameter in Bayes


I am going over the paper "Sparse Bayesian Learning and the Relevance Vector Machine" by Michael Tipping. There is one equality there which I do not fully understand. He states:

$$p(t | \alpha, \sigma^2) = \int p(t|w,\sigma^2)p(w|\alpha)dw$$

I would understand it if it would be:

$$p(t | \alpha, \sigma^2) = \int p(t|w,\sigma^2, \alpha)p(w | \alpha, \sigma^2)dw$$

Does this mean that the conditioning on $\alpha$ can be added to or dropped from either factor without assuming anything about it? I do not understand why.

Thanks


There is 1 best solution below


I found the answer in the previous question: A confusing exercise about Bayes' rule.

Since $w$ depends only on $\alpha$, its prior does not involve $\sigma^2$, so $p(w \mid \alpha, \sigma^2) = p(w \mid \alpha)$.

Then $p(t \mid w, \alpha, \sigma^2) = p(t \mid w, \sigma^2)$ because, given $w$, the target $t$ is conditionally independent of $\alpha$: the hyperparameter $\alpha$ influences $t$ only through $w$, so once $w$ is fixed, conditioning on $\alpha$ adds nothing.
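The identity can also be checked numerically. Below is a minimal sketch (not from the paper) for a toy one-dimensional case: $w \sim \mathcal{N}(0, \alpha^{-1})$ and $t \mid w \sim \mathcal{N}(wx, \sigma^2)$ for a single input $x$, so the marginal is $t \mid \alpha, \sigma^2 \sim \mathcal{N}(0, x^2/\alpha + \sigma^2)$. The values of `alpha`, `sigma2`, `x`, and `t` are arbitrary choices for illustration. A Monte Carlo average of $p(t \mid w, \sigma^2)$ over draws of $w \sim p(w \mid \alpha)$ should match the analytic marginal:

```python
import numpy as np

rng = np.random.default_rng(0)

alpha, sigma2 = 2.0, 0.5   # hypothetical prior precision and noise variance
x, t = 1.3, 0.7            # a single input and target, chosen arbitrarily

def normal_pdf(v, mean, var):
    """Density of N(mean, var) evaluated at v."""
    return np.exp(-(v - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Monte Carlo estimate of the integral:
#   p(t | alpha, sigma^2) = E_{w ~ p(w|alpha)}[ p(t | w, sigma^2) ]
w_samples = rng.normal(0.0, 1.0 / np.sqrt(alpha), size=1_000_000)
mc = normal_pdf(t, w_samples * x, sigma2).mean()

# Analytic marginal: t | alpha, sigma^2 ~ N(0, x^2/alpha + sigma^2)
analytic = normal_pdf(t, 0.0, x ** 2 / alpha + sigma2)

print(f"Monte Carlo: {mc:.4f}   analytic: {analytic:.4f}")
```

Note that the sampler never needs $\sigma^2$ to draw $w$, which is exactly the point: $p(w \mid \alpha, \sigma^2) = p(w \mid \alpha)$.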