Independence and conditional probability


Let A, B and C be three random variables such that A and C are independent, but B and C are dependent.
Do we have $$P(A \mid B) = P(A \mid B, C)$$ (because A and C are independent)?
If yes, how can we prove it?
If not, why not?

I ask because I have an example like this in my lesson: $$P(A) \cdot P(C \mid A) \cdot P(F \mid C) = P(A, C, F)$$ and we only know that A and F are independent.
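A factorization of this shape can be checked numerically. The sketch below is hedged: it assumes the standard condition under which the identity holds, namely that F is conditionally independent of A given C (a Markov chain A → C → F); the probability tables are made-up binary examples, not from the lesson.

```python
import itertools

# Hypothetical binary probability tables (not from the original exercise).
p_a = {0: 0.3, 1: 0.7}
p_c_given_a = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.2, 1: 0.8}}  # p_c_given_a[a][c]
p_f_given_c = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}  # p_f_given_c[c][f]

# Build the joint by the chain rule P(A,C,F) = P(A)·P(C|A)·P(F|A,C),
# using the Markov assumption P(F|A,C) = P(F|C).
joint = {(a, c, f): p_a[a] * p_c_given_a[a][c] * p_f_given_c[c][f]
         for a, c, f in itertools.product((0, 1), repeat=3)}

# Sanity check: under this construction, P(F=1 | A=0, C=1) equals P(F=1 | C=1),
# i.e. the factorization is consistent with conditional independence given C.
p_f_given_ac = joint[(0, 1, 1)] / sum(joint[(0, 1, f)] for f in (0, 1))
print(abs(p_f_given_ac - p_f_given_c[1][1]) < 1e-12)  # True
```

Note that plain independence of A and F alone would not justify the factorization; the conditional version given C is what the chain rule needs.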

That the random variables A and C are independent does not imply that P(A|B)=P(A|B,C).

For a counterexample, take A and B i.i.d. uniform on {-1,+1} (centered Bernoulli), and set C = AB, which is again uniform on {-1,+1}. Conditionally on B, the distribution of A is uniform on {-1,+1}, hence A is independent of B. Conditionally on (B,C), however, the distribution of A is the Dirac mass at BC, hence A is not independent of (B,C); in fact A is even measurable with respect to (B,C), since C = AB and B² = 1 imply A = BC.
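The counterexample is small enough to verify by exhaustive enumeration of the four equally likely outcomes for (A, B); this is just a sanity check of the computation above, not part of the proof.

```python
# A and B are i.i.d. uniform on {-1, +1}; C = A*B. Each pair (a, b) has probability 1/4.
outcomes = [(a, b) for a in (-1, 1) for b in (-1, 1)]

def prob(event):
    """Probability of an event, given as a predicate on (a, b)."""
    return sum(1 for (a, b) in outcomes if event(a, b)) / len(outcomes)

# P(A = 1 | B = 1) = P(A=1, B=1) / P(B=1)
p_a_given_b = prob(lambda a, b: a == 1 and b == 1) / prob(lambda a, b: b == 1)

# P(A = 1 | B = 1, C = 1): conditioning on both B and C = A*B pins A down.
p_a_given_bc = (prob(lambda a, b: a == 1 and b == 1 and a * b == 1)
                / prob(lambda a, b: b == 1 and a * b == 1))

print(p_a_given_b)   # 0.5 -> A is independent of B
print(p_a_given_bc)  # 1.0 -> A is determined by (B, C)
```

So P(A|B) ≠ P(A|B,C) here, even though A and C are independent (C = AB is uniform on {-1,+1} and independent of A, as a direct check of the four outcomes confirms).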


Nota: I wish math.SE askers of probability questions would stop using the symbols A, B, C, etc., which commonly denote events, to denote random variables. Every decent text I have stumbled upon denotes random variables by X, Y, Z (next in line are U, V, W, and possibly S and T).

Formulas such as P(A|B)=P(A|B,C) should refer to three events A, B and C, and to the conditional probabilities P(A|B)=P(A∩B)/P(B) and P(A|B,C)=P(A∩B∩C)/P(B∩C).

I cannot determine whether this strange swap of notations on several math.SE pages signals that the exercise was incorrectly copied here by some careless student, or whether some teachers are indeed perverse enough to exchange the two notations on purpose.