Need some explanation for the following Bayesian graph


I am trying to understand some lectures. An equation is presented as follows:

[image: equation from the lecture]

I am confused by the first line. Should it be $P(w,y\mid X) = P(y\mid X,w)\,P(X\mid w)\,P(w)$? If that were correct, would it mean that $P(X\mid w) = 1$? How should we explain this?

Thanks



Best answer:

This is because we assume that $p(w\mid X) = p(w)$: seeing values of the random variable $X$ tells you nothing about the distribution of $w$ that you did not already know, which is a reasonable assumption in the case of linear regression. Then

$$p(w,y\mid X) = p(y\mid X,w)\, p(w\mid X) = p(y\mid X,w)\, p(w).$$


EDIT: The answer to the question "Is $y$ marginally independent of $x$?" is no.

$y$ is actually a function of $x$ (a direct form of dependence). Here is a rigorous proof. We will use the Gaussian marginalization property, which states:

$$\int_{z} \mathcal{N}(x|Az+b, L^{-1}) \mathcal{N}(z|\mu,\Lambda^{-1}) dz = \mathcal{N}(x|A\mu+b ,A\Lambda^{-1}A^{T}+L^{-1})$$

$$p(y\mid x) = \int_{w} p(y\mid w,x)\, p(w\mid x)\, dw = \int_{w} p(y\mid w,x)\, p(w)\, dw = \int_{w} \mathcal{N}(y\mid w^{T}x,\ \sigma^2)\, \mathcal{N}(w\mid 0,\ \gamma^2 I)\, dw = \mathcal{N}\!\left(y\mid 0,\ \gamma^2 x^{T}x + \sigma^2\right)$$

(the last step applies the property above with $A = x^{T}$, $b = 0$, $\Lambda^{-1} = \gamma^2 I$, and $L^{-1} = \sigma^2$). As you can see, $p(y\mid x)$ depends directly on $x$!
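The marginal variance $\gamma^2 x^{T}x + \sigma^2$ can be checked numerically with a quick Monte Carlo sketch (the particular values of `gamma`, `sigma`, and `x` below are arbitrary choices, not from the lecture):

```python
# Monte Carlo check that marginalizing w out of y = w^T x + eps,
# with w ~ N(0, gamma^2 I) and eps ~ N(0, sigma^2),
# gives p(y|x) = N(0, gamma^2 x^T x + sigma^2).
import numpy as np

rng = np.random.default_rng(0)

gamma, sigma = 0.8, 0.5          # prior std of w, observation-noise std
x = np.array([1.0, -2.0, 0.5])   # a fixed input vector

n = 200_000
w = rng.normal(0.0, gamma, size=(n, x.size))  # samples w ~ N(0, gamma^2 I)
eps = rng.normal(0.0, sigma, size=n)          # observation noise
y = w @ x + eps                               # y = w^T x + eps, marginal over w

print("sample mean:", y.mean())                          # close to 0
print("sample var :", y.var())                           # close to theory
print("theory var :", gamma**2 * x @ x + sigma**2)
```

The sample variance matches $\gamma^2 x^{T}x + \sigma^2$, and changing `x` changes it, which is exactly the dependence of $p(y\mid x)$ on $x$ shown in the proof.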

If you are just starting out in Bayesian modelling, skip the proof and simply note that $y = w^{T}x + \varepsilon$, i.e. $y$ directly depends on $x$.