I've seen the simple multiplication rule for independent events: $$P(A \cap B) = P(A)P(B)$$
used in a somewhat similar way for conditional probabilities like:
$$P(\text{all } n \text{ events occurring}) = p_1 \cdot p_2 \cdots p_n$$
where $p_i$, $i = 1, 2, \ldots, n$,
is the $i$-th conditional probability. The individual events must be independent for the probabilities to multiply like this, but how is the multiplication rule derived? Is there a proof, like the one for the probability of independent events?
Thank you
A simple proof of the base case uses the fact that $$P(B|A)=\frac{P(B\cap A)}{P(A)} \implies P(B \cap A)=P(B|A)P(A)$$
Now, assume that $A$ and $B$ are independent events, so $P(B|A)=P(B)$, since whether $B$ occurs does not depend on $A$ at all. You can test this intuition by flipping a coin: what is the probability that your second flip is tails, given that your first flip is heads? It is still $\frac{1}{2}$.
Therefore, $P(B \cap A)=P(B|A)P(A)=P(B)P(A)$ for independent events.
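The base case can be checked empirically. Here is a minimal Monte Carlo sketch (event names $A$ and $B$ are just the coin-flip events from above, not anything from a library): it estimates $P(B \cap A)$ directly and compares it to the product $P(B)P(A)$.

```python
import random

random.seed(0)
trials = 100_000

# Event A: first flip is heads; event B: second flip is tails.
count_a = 0
count_b = 0
count_both = 0
for _ in range(trials):
    first_heads = random.random() < 0.5
    second_heads = random.random() < 0.5
    if first_heads:
        count_a += 1
    if not second_heads:
        count_b += 1
    if first_heads and not second_heads:
        count_both += 1

p_a = count_a / trials
p_b = count_b / trials
p_both = count_both / trials

# For independent flips, P(B ∩ A) should match P(B)·P(A) ≈ 1/4.
print(p_both, p_a * p_b)
```

Both printed values should land close to $0.25$, matching the derivation above.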
This can be generalized:
$$P(A_1 \cap A_2 \cap \cdots \cap A_n)=P(A_n|A_1\cap A_2\cap\cdots\cap A_{n-1})\cdots P(A_3|A_2\cap A_1)P(A_2|A_1)P(A_1)$$ $$=P(A_n)\cdots P(A_3)P(A_2)P(A_1)$$ where the second equality holds when the $A_i$ are mutually independent. This all follows from the relationship between the probability of an intersection and conditional probability. Any time you condition on events that are independent of the event in question, you can drop the conditioning events.
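The chain-rule factorization can be verified exactly by enumerating a small sample space rather than simulating. This sketch (the events $A_1, A_2, A_3$ are the three-coin-flip events used below) checks that the telescoping product equals the probability of the triple intersection:

```python
from fractions import Fraction
from itertools import product

# Sample space: all outcomes of three fair coin flips, each with probability 1/8.
outcomes = set(product("HT", repeat=3))
P_EACH = Fraction(1, len(outcomes))

def prob(event):
    """Exact probability of a set of outcomes."""
    return P_EACH * len(event)

# A1: flip 1 is H, A2: flip 2 is H, A3: flip 3 is T.
A1 = {o for o in outcomes if o[0] == "H"}
A2 = {o for o in outcomes if o[1] == "H"}
A3 = {o for o in outcomes if o[2] == "T"}

# Left side: P(A1 ∩ A2 ∩ A3).
lhs = prob(A1 & A2 & A3)

# Right side: P(A3 | A1 ∩ A2) · P(A2 | A1) · P(A1), computed from definitions.
rhs = (prob(A3 & A2 & A1) / prob(A2 & A1)) * (prob(A2 & A1) / prob(A1)) * prob(A1)

print(lhs, rhs)  # both exactly 1/8
```

Because the flips are independent, each conditional factor also equals the corresponding unconditional probability, so the product collapses to $P(A_3)P(A_2)P(A_1) = \frac{1}{8}$.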
Intuitively, this makes sense when you consider the coin flips. What is the probability that we get heads, then tails? Call that outcome $HT$. Our sample space is $\{TT,TH,HT,HH\}$, all equally likely. $$P(HT)=P(T|H)P(H)=P(T)P(H)=\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{4}$$
The same holds for three flips. The sample space is $\{TTT,TTH,THT,THH,HTT,HTH,HHT,HHH\}$, and $$P(HHT)=P(T|HH)P(H|H)P(H)=P(T)P(H)P(H)=\frac{1}{2}\cdot\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{8}$$
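As a final sanity check, the three-flip case can also be simulated directly; this short sketch estimates $P(HHT)$ by counting how often the exact sequence heads, heads, tails appears:

```python
import random

random.seed(1)
trials = 200_000

# Count trials whose three flips come up exactly H, H, T (True = heads).
hht = sum(
    1
    for _ in range(trials)
    if [random.random() < 0.5 for _ in range(3)] == [True, True, False]
)

estimate = hht / trials
print(estimate)  # close to 1/8 = 0.125
```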