Proof that mutual statistical independence implies pairwise independence


This question about pairwise vs. mutual relations is related to some existing questions: here and here.

Kobayashi, Mark & Turin's Probability, Random Processes and Statistical Analysis, 2012, states without proof:

three events, A, B, C are mutually independent when:

P[A,B] = P[A]P[B],  P[B,C] = P[B]P[C],  P[A,C] = P[A]P[C],
P[A,B,C] = P[A]P[B]P[C]

No three of these relations necessarily imply the fourth. [my italics]

However, Wikipedia and other sources generally agree that mutual independence implies pairwise independence, though likewise without a demonstration.

What is the simplest proof that mutual independence implies pairwise independence?

Note: G.-C. Rota wrote that probability can be understood by focusing either on random variables or on distributions. However, the two views should be equivalent, correct?

On BEST ANSWER

Mutual independence means all four of the identities you copied hold; pairwise independence means only the first three hold. So the implication is immediate from the definitions: if all four identities hold, then in particular the first three do. (The italicized remark in the book is about a different, converse question: no three of the identities by themselves force the fourth. In particular, the three pairwise identities can hold while the product identity for all three events fails.)
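To make the converse failure concrete, here is a quick numerical check of the classic Bernstein-style counterexample (my own illustration, not from the book): two fair coin tosses, with A = "first toss is heads", B = "second toss is heads", C = "both tosses agree". The three pairwise identities hold, but the four-way product identity does not.

```python
from itertools import product

# Sample space: two fair coin tosses, 4 equally likely outcomes.
outcomes = list(product("HT", repeat=2))
p = 1 / len(outcomes)

A = {o for o in outcomes if o[0] == "H"}   # first toss heads
B = {o for o in outcomes if o[1] == "H"}   # second toss heads
C = {o for o in outcomes if o[0] == o[1]}  # both tosses agree

def P(event):
    """Probability of an event under the uniform measure."""
    return len(event) * p

# The three pairwise identities hold:
assert P(A & B) == P(A) * P(B)
assert P(B & C) == P(B) * P(C)
assert P(A & C) == P(A) * P(C)

# ...but the fourth fails: P[A,B,C] = 1/4, while P[A]P[B]P[C] = 1/8.
assert P(A & B & C) != P(A) * P(B) * P(C)
```

So A, B, C here are pairwise independent but not mutually independent, which is exactly why the definition of mutual independence must list the fourth identity separately.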