For every two events A ⊆ S and B ⊆ S one has the bound p(A ∩ B) ≥ p(A) + p(B) − 1


Let S be a sample space, and let p : S → [0, 1] be a probability distribution over S. Use set theory and the definition of probability to show that, for every two events A ⊆ S and B ⊆ S one has the bound p(A ∩ B) ≥ p(A) + p(B) − 1.

I tried to apply set theory but wasn't successful in proving the above.


2 Answers

BEST ANSWER

$P(A)=P(A\cap B)+P(A\setminus B)$ because $A\cap B$ and $A\setminus B$ are disjoint events whose union is $A$. Similarly, $P(B)=P(B\cap A)+P(B\setminus A)$. Add these two to get $P(A)+P(B)=2P(A\cap B)+P(A\setminus B)+P(B\setminus A)$. Now $A\cap B$, $A\setminus B$ and $B\setminus A$ are disjoint and their union is $A\cup B$, so $P(A\cap B)+P(A\setminus B)+P(B\setminus A)=P(A\cup B)$. Putting these together we get $P(A)+P(B)=P(A\cap B)+P(A\cup B)\leq P(A\cap B)+1$, which gives the inequality you want.
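The decomposition and the final bound can be checked numerically on a small finite sample space. The sketch below uses a hypothetical example (two fair dice with the uniform distribution; the events $A$, $B$ are my choices, not from the question) to verify both the identity $P(A)=P(A\cap B)+P(A\setminus B)$ and the inequality $P(A\cap B)\ge P(A)+P(B)-1$:

```python
from itertools import product

# Hypothetical finite sample space: two fair dice, uniform distribution.
S = list(product(range(1, 7), repeat=2))
p = {s: 1 / len(S) for s in S}

def prob(E):
    """Probability of an event E, a subset of S."""
    return sum(p[s] for s in E)

A = {s for s in S if s[0] + s[1] >= 7}  # the sum is at least 7
B = {s for s in S if s[0] == s[1]}      # doubles

# A∩B and A\B are disjoint with union A, so their probabilities add up.
assert abs(prob(A) - (prob(A & B) + prob(A - B))) < 1e-12

# The bound from the answer: P(A∩B) >= P(A) + P(B) - 1.
assert prob(A & B) >= prob(A) + prob(B) - 1
```

Any other pair of events over any finite distribution would pass the same checks, since the argument above uses only disjointness and $P(A\cup B)\le 1$.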


big hint:

$$P(A \cup B)=P(A)+P(B)-P(A \cap B)$$
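One way to use the hint: rearrange for $P(A\cap B)$ and apply $P(A\cup B)\le 1$, which holds because $A\cup B\subseteq S$:

$$P(A \cap B)=P(A)+P(B)-P(A \cup B) \geq P(A)+P(B)-1.$$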