Conditional Probability with four variables


In the book A First Course in Machine Learning, second edition, by Simon Rogers, page 322, the author states that the joint posterior density over f and y is given by:

$$p(f, y \mid t, X) = \frac{p(t \mid y)\, p(y \mid f)\, p(f \mid X)}{p(t \mid X)}$$

I have tried many times to get to that result using conditional probability and Bayes' rule, but to no avail. Can anyone help me?

1 Answer

Since the LHS is a joint posterior density, your $p(t|y)$ is shorthand for $p(T=t \mid Y=y)$, where $T$ and $Y$ are continuous random variables evaluated at the values $t$ and $y$, respectively, and $p$ is a probability density function, to which Bayes' theorem still applies in the usual form. Applying Bayes' theorem to the LHS gives the following, where the second equality uses the chain rule for conditional densities (and writes $p(y|f, X) = p(y|f)$, since in the book's model $y$ depends on $X$ only through $f$): $$ \begin{aligned} p(f, y|t, X) &= p(t | f, y, X)\,p(f,y|X) / p(t|X) \\ &= p(t|f, y, X)\,p(y|f)\,p(f|X) / p(t|X) \\ \end{aligned} $$

Thus, to arrive at your book's conclusion, it must be stated somewhere that the random variable $T$ depends only on $Y$, and not on $F$ or the background $X$, so that $p(t|f, y, X) = p(t|y)$, which gives your claimed result.
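The two steps above can be sanity-checked numerically. The sketch below (my own toy example, not from the book) uses small discrete distributions in place of densities, with $X$ held fixed and left implicit. It builds a joint under the assumed structure "$T$ depends on $Y$ only", verifies that $p(t|f,y)$ then collapses to $p(t|y)$, and confirms that generic Bayes' rule and the book's factorized posterior agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete distributions (all made up; X is held fixed and implicit).
nf, ny, nt = 3, 4, 2
p_f = rng.dirichlet(np.ones(nf))          # p(f)
p_y_f = rng.dirichlet(np.ones(ny), nf)    # p(y|f), shape (nf, ny)
p_t_y = rng.dirichlet(np.ones(nt), ny)    # p(t|y), shape (ny, nt)

# Joint under the assumed structure: T depends on Y only.
# p(f, y, t) = p(t|y) p(y|f) p(f)
joint = p_f[:, None, None] * p_y_f[:, :, None] * p_t_y[None, :, :]

# Conditional-independence check: p(t|f,y) recovered from the joint
# is the same for every value of f, i.e. it equals p(t|y).
p_fy = joint.sum(axis=2)                  # p(f, y)
p_t_fy = joint / p_fy[:, :, None]         # p(t|f,y)
assert np.allclose(p_t_fy, np.broadcast_to(p_t_y[None, :, :], p_t_fy.shape))

# Generic Bayes' rule vs. the book's factorized form.
p_t = joint.sum(axis=(0, 1))              # p(t)
posterior_bayes = p_t_fy * p_fy[:, :, None] / p_t   # p(t|f,y) p(f,y) / p(t)
posterior_book = (p_t_y[None, :, :] * p_y_f[:, :, None]
                  * p_f[:, None, None]) / p_t       # p(t|y) p(y|f) p(f) / p(t)
assert np.allclose(posterior_bayes, posterior_book)
print("both forms agree")
```

If instead $T$ also depended directly on $F$ (i.e. a table $p(t|f,y)$ that varies with $f$), the first assertion would fail, which is exactly why the book's formula requires that conditional-independence assumption.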