I'm just getting started with statistics and am stuck on a basic Bayes' Rule problem. You can see the problem and answer on page 16 here, but I've restated it below.
$P(G|A,M)=\frac{P(A|G,M)P(G|M)}{P(A|M)}$
Below it says: "This is Bayes' Rule, where the probabilities are all taken as conditional."
If Bayes' Rule were applied without the conditioning on $M$, the left-most probability in the numerator would be $P(A,M|G)$, correct? Can someone explain the conditioning that leads to $P(A|G,M)$ instead? I presume the two are not equivalent, or that I'm missing some basic property that explains the jump.
Thanks!
You need to apply the definition of conditional probability several times:
$$P\{G|A,M\}=\frac{P\{G,A,M\}}{P\{A,M\}}=\frac{P\{A|G,M\}\cdot P\{G,M\}}{P\{A,M\}}=\frac{P\{A|G,M\}\cdot P\{G|M\}\cdot P\{M\}}{P\{A|M\}\cdot P\{M\}}$$

Cancelling the common factor $P\{M\}$ in the last fraction leaves exactly the conditional form of Bayes' Rule you quoted:

$$P\{G|A,M\}=\frac{P\{A|G,M\}\cdot P\{G|M\}}{P\{A|M\}}$$
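If it helps to see the identity hold numerically, here is a small sketch that checks it on a made-up joint distribution over three binary variables $G$, $A$, $M$ (the probabilities are arbitrary; they just sum to 1):

```python
import itertools

# Hypothetical joint distribution P(G, A, M) over binary variables.
# Keys are (g, a, m); values are arbitrary but sum to 1.
joint = {
    (0, 0, 0): 0.05, (0, 0, 1): 0.10, (0, 1, 0): 0.15, (0, 1, 1): 0.20,
    (1, 0, 0): 0.08, (1, 0, 1): 0.12, (1, 1, 0): 0.13, (1, 1, 1): 0.17,
}

def p(**fixed):
    """Marginal probability that the named variables take the given values."""
    total = 0.0
    for g, a, m in itertools.product((0, 1), repeat=3):
        point = {"G": g, "A": a, "M": m}
        if all(point[k] == v for k, v in fixed.items()):
            total += joint[(g, a, m)]
    return total

# Left-hand side: P(G=1 | A=1, M=1), via the definition of conditional probability.
lhs = p(G=1, A=1, M=1) / p(A=1, M=1)

# Right-hand side: P(A=1 | G=1, M=1) * P(G=1 | M=1) / P(A=1 | M=1),
# i.e. Bayes' Rule with everything conditioned on M.
rhs = (p(A=1, G=1, M=1) / p(G=1, M=1)) \
    * (p(G=1, M=1) / p(M=1)) \
    / (p(A=1, M=1) / p(M=1))

print(abs(lhs - rhs) < 1e-12)  # prints True: the two sides agree
```

The numbers here carry no meaning; any valid joint distribution would give the same agreement, since the identity follows purely from the definition of conditional probability.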