I am trying to solve the following exercise:
In my town, it's rainy one third of the days. Given that it is rainy,
there will be heavy traffic with probability 1/2, and given that it is not rainy,
there will be heavy traffic with probability 1/4. If it's rainy and there is heavy traffic,
I arrive late for work with probability 1/2. On the other hand,
the probability of being late is reduced to 1/8 if it is not rainy and
there is no heavy traffic.
In other situations (rainy and no traffic, not rainy and traffic)
the probability of being late is 0.25.
You pick a random day.
(a) What is the probability that it's not raining and there is heavy
traffic and I am not late?
(b) What is the probability that I am late?
(c) Given that I arrived late at work, what is the probability that it rained that day?
So we have:
R: rain, H: heavy traffic, L: being late
$P(R)=\frac{1}{3}$,
$P(H|R)=\frac{1}{2}$, $P(H|R^c)=\frac{1}{4}$,
$P(L|H\cap R)=\frac{1}{2}$, $P(L|H^c\cap R^c)=\frac{1}{8}$, $P(L|H^c\cap R)=P(L|H\cap R^c)=\frac{1}{4}$
I've solved questions (a) and (b) and my results are the same as in the solution sheet:
$(a): P(L^c\cap H\cap R^c)=1/8 $
$(b): P(L)=11/48 $
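(For anyone who wants to double-check these numbers: here is a small sketch that enumerates the joint distribution over $(R, H, L)$ with exact fractions. The variable names are my own, not part of the exercise.)

```python
from fractions import Fraction as F

# Given probabilities from the problem statement
P_R = F(1, 3)                 # P(R): rainy
P_H_given = {True: F(1, 2),   # P(H | R)
             False: F(1, 4)}  # P(H | R^c)
P_L_given = {(True, True): F(1, 2),    # P(L | R ∩ H)
             (True, False): F(1, 4),   # P(L | R ∩ H^c)
             (False, True): F(1, 4),   # P(L | R^c ∩ H)
             (False, False): F(1, 8)}  # P(L | R^c ∩ H^c)

# Enumerate the joint distribution over (R, H, L)
joint = {}
for r in (True, False):
    p_r = P_R if r else 1 - P_R
    for h in (True, False):
        p_h = P_H_given[r] if h else 1 - P_H_given[r]
        for l in (True, False):
            p_l = P_L_given[(r, h)] if l else 1 - P_L_given[(r, h)]
            joint[(r, h, l)] = p_r * p_h * p_l

# (a) P(R^c ∩ H ∩ L^c)
print(joint[(False, True, False)])  # 1/8

# (b) P(L), summed over all eight outcomes where L occurs
P_L = sum(p for (r, h, l), p in joint.items() if l)
print(P_L)  # 11/48
```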
But I have a problem with question (c). My attempt to solve it looks like this:
$ P(R|L) = \frac{P(L|R)P(R)}{P(L)}$ (Bayes' rule)
$ =[P(L|H\cap R)P(H\cap R) + P(L|H^C\cap R)P(H^C\cap R)]\frac{P(R)}{P(L)} $ (total probability)
$ =[P(L|H\cap R)P(H|R)P(R) + P(L|H^C\cap R)P(H^C|R)P(R)]\frac{P(R)}{P(L)} $
$ =[P(L|H\cap R)P(H|R)P(R) + P(L|H^C\cap R)(1-P(H|R))P(R)]\frac{P(R)}{P(L)} $
$ =(\frac{1}{2}\frac{1}{2}\frac{1}{3}+\frac{1}{4}\frac{1}{2}\frac{1}{3})\frac{1}{3}\frac{48}{11}=\frac{3}{24}\frac{1}{3}\frac{48}{11}=\frac{2}{11} $
But according to the solution sheet, the correct result is $\frac{6}{11}$. So where did I make the error? I'm quite new to probability, so I don't have enough experience to find it myself. Thank you in advance for your help.
The law of total probability
$$P(L)=P(L\mid H)P(H) + P(L\mid H^c)P(H^c)\tag1$$
continues to work when dealing with conditional probabilities. One device you can use to remember how: when conditioning on event $R$, write $P(\cdot\mid R)$ in the form $P_R(\cdot)$, so that the conditional probability looks more like a "traditional" probability. You apply the law of total probability to this $P_R$:
$$P_R(L)=P_R(L\mid H)P_R(H) + P_R(L\mid H^c)P_R(H^c)\tag2$$
Now translate back:
$$P(L\mid R) = P(L\mid H\cap R)P(H\mid R) + P(L\mid H^c \cap R)P(H^c\mid R)\tag3$$
which is the form obtained by @JMoravitz.
Notice that you obtain (1) when you delete the $R$'s from (3). With practice you will be able to decompose $P(L\mid R)$ by jumping directly to (3) without the intermediate step (2).
The reason why $P_R(L\mid H) = P(L\mid H\cap R)$: $$ \begin{align} P_R(L\mid H)&\stackrel{(a)}=\frac{P_R(L\cap H)}{P_R(H)}\\ &\stackrel{(b)}=\frac{P(L\cap H\mid R)}{P(H\mid R)} =\frac{P(L\cap H\mid R)P(R)}{P(H\mid R)P(R)}\\ &\stackrel{(a)}=\frac{P(L\cap H\cap R)}{P(H\cap R)}\\ &\stackrel{(a)}=P(L\mid H\cap R)\\ \end{align} $$ Step (a) is the definition of conditional probability; step (b) is the definition of $P_R(\cdot)$.
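A quick numeric check of decomposition (3) followed by Bayes' rule, using exact fractions (variable names are mine; the input values come from the problem statement and the $P(L)=11/48$ result from part (b)):

```python
from fractions import Fraction as F

# Inputs from the problem statement
P_R = F(1, 3)            # P(R)
P_H_given_R = F(1, 2)    # P(H | R)
P_L_given_HR = F(1, 2)   # P(L | H ∩ R)
P_L_given_HcR = F(1, 4)  # P(L | H^c ∩ R)
P_L = F(11, 48)          # P(L), from part (b)

# Equation (3): P(L|R) = P(L|H∩R)P(H|R) + P(L|H^c∩R)P(H^c|R)
P_L_given_R = P_L_given_HR * P_H_given_R + P_L_given_HcR * (1 - P_H_given_R)
print(P_L_given_R)  # 3/8

# Bayes' rule: P(R|L) = P(L|R)P(R)/P(L)
P_R_given_L = P_L_given_R * P_R / P_L
print(P_R_given_L)  # 6/11
```

Note that each term in (3) carries a single factor $P(H\mid R)$ or $P(H^c\mid R)$, not $P(H\cap R)$; the extra factor of $P(R)$ in the question's attempt is exactly what shrinks $6/11$ down to $2/11$.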