Probability theory inequality


In a set of notes I have seen the following inequality. Let $A$ and $B$ be non-negative random variables. Then $$\mathbb{P}(A+B > \epsilon) \leq \mathbb{P}\left(A > \frac{\epsilon}{2} \right)+\mathbb{P}\left(B > \frac{\epsilon}{2} \right)$$ I do not have a probability theory background, so this inequality is not obvious to me. How would one derive it?
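Before deriving it formally, a quick numerical sanity check can make the claim plausible. The sketch below is my own illustration (not from the notes): it draws $A$ and $B$ from exponential distributions, a hypothetical choice of non-negative laws, fixes $\epsilon = 1$, and compares Monte Carlo estimates of the two sides.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
eps = 1.0

# Two non-negative random variables; exponential laws chosen purely for illustration.
A = rng.exponential(scale=0.4, size=n)
B = rng.exponential(scale=0.7, size=n)

lhs = np.mean(A + B > eps)                         # estimate of P(A + B > eps)
rhs = np.mean(A > eps / 2) + np.mean(B > eps / 2)  # estimate of P(A > eps/2) + P(B > eps/2)

print(f"P(A+B > eps)                ~ {lhs:.4f}")
print(f"P(A > eps/2) + P(B > eps/2) ~ {rhs:.4f}")
assert lhs <= rhs + 1e-3  # the inequality should hold up to sampling noise
```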


Two answers are given below.

Answer 1:

The key fact is that if $A + B > \epsilon$, then either $A > {\epsilon \over 2}$ or $B > {\epsilon \over 2}$: otherwise we would have both $A \leq {\epsilon \over 2}$ and $B \leq {\epsilon \over 2}$, and hence $A + B \leq \epsilon$. Then use that the probability of the union of two events is at most the sum of the probabilities of those events.
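For completeness, that union step is itself one line: for any events $E$ and $F$, inclusion–exclusion together with $\mathbb{P}(E \cap F) \geq 0$ gives

$$ \mathbb{P}(E \cup F) = \mathbb{P}(E) + \mathbb{P}(F) - \mathbb{P}(E \cap F) \leq \mathbb{P}(E) + \mathbb{P}(F) $$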

Answer 2:

Just think in terms of sets. If you take $\omega \in \Omega$ such that:

$$ A(\omega) + B(\omega) > \varepsilon $$

then you have either:

$$ A(\omega) > \frac{\varepsilon}{2} \quad \quad \text{or} \quad \quad B(\omega) > \frac{\varepsilon}{2} $$

since $A$ and $B$ are two non-negative random variables. This means that:

$$ (A+B>\varepsilon) \subset \left(A>\frac{\varepsilon}{2} \right)\cup \left(B>\frac{\varepsilon}{2} \right) $$

And finally:

$$ \mathbb P(A+B>\varepsilon) \leqslant \mathbb P\left(A>\frac{\varepsilon}{2} \right)+\mathbb P\left(B>\frac{\varepsilon}{2} \right)$$
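As a concrete check of the inclusion itself, the small sketch below (my own illustration, with arbitrary uniform samples standing in for $\Omega$) verifies that no sample point lies in $(A+B>\varepsilon)$ without also lying in the union on the right:

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 1.0

# A finite stand-in for Omega: each index is one outcome omega,
# with illustrative non-negative values for A(omega) and B(omega).
A = rng.uniform(0.0, 1.0, size=10**6)
B = rng.uniform(0.0, 1.0, size=10**6)

left = A + B > eps                     # the event (A + B > eps)
right = (A > eps / 2) | (B > eps / 2)  # the event (A > eps/2) union (B > eps/2)

# Set inclusion: left is a subset of right, i.e. no point of left lies outside right.
assert not np.any(left & ~right)
print("inclusion verified on", left.size, "outcomes")
```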