Probability inequality for sum of non-negative independent random variables


Suppose $X_i$, $i \in \mathbb{N}$ are independent binary random variables with $P(X_i = 1) = p_i = 1-P(X_i = 0)$, and define $S_n \equiv \sum_{i=1}^n X_i$ and $\mu \equiv E(S_n) = \sum_{i=1}^n p_i$. I want to prove that for every $x > 0$, we have $$P(S_n \ge x) \leq \left(\frac{\mu e}{x}\right)^x .$$

I can do this for $x \in (0,1]$ by noting that the function $f(y) \equiv y^x$, $y \ge 0$, is concave for $x$ in this range, hence we have $$P(S_n \ge x) \leq P(eS_n \ge x) = P(e^x S_n^x \ge x^x) \leq \frac{e^x E(S_n^x)}{x^x}\leq \frac{e^x E(S_n)^x}{x^x} = \left(\frac{\mu e}{x} \right)^x, $$

where the last inequality is Jensen's (applied to the concave function $f$) and the one before it is Markov's. I'm stuck on the case $x > 1$: we can't apply Jensen's inequality the same way, because $f(y) = y^x$ is convex for $x \in (1, \infty)$, so we need a different strategy entirely. I'm not sure if this is the right idea, but we can write down the probability exactly as $$P(S_n \ge x) = \sum_{J \subseteq \{1, \dots, n\},\, |J| \ge x} \; \prod_{i \in J} p_i \prod_{i \notin J} (1-p_i). $$ I can't see anything fruitful coming from this, though. Any help would be much appreciated!
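Not a proof, but as a numerical sanity check, the exact tail probability can be computed with the standard dynamic program for the Poisson-binomial distribution and compared against the claimed bound. The $p_i$ below are arbitrary illustrative values, not taken from the problem:

```python
import math

def tail_prob(ps, x):
    """Exact P(S_n >= x) for independent Bernoulli(p_i) variables,
    via the standard Poisson-binomial dynamic program."""
    dist = [1.0]  # dist[k] = P(sum of processed X_i equals k)
    for p in ps:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1 - p)   # branch X_i = 0
            new[k + 1] += q * p     # branch X_i = 1
        dist = new
    return sum(q for k, q in enumerate(dist) if k >= x)

def claimed_bound(ps, x):
    mu = sum(ps)  # mu = E(S_n)
    return (mu * math.e / x) ** x

ps = [0.1, 0.3, 0.25, 0.4, 0.05]  # illustrative p_i
for x in (0.5, 1, 1.5, 2, 3):
    assert tail_prob(ps, x) <= claimed_bound(ps, x)
```

For small $x$ the bound is vacuous (it exceeds $1$), but for $x$ well above $\mu$ it becomes informative.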

Best answer:

We may assume $x > \mu$: if $x \le \mu$, then $\mu e / x \ge e$, so the right-hand side exceeds $1$ and the bound is trivial.

I follow the proof of Bernstein's inequality: https://en.wikipedia.org/wiki/Bernstein_inequalities_(probability_theory)

For any $\theta > 0$, Markov's inequality gives $$ P(S_n \ge x) = P(\exp(\theta(S_n-x)) \ge 1) \le E(\exp(\theta(S_n-x))) = e^{-\theta x} \prod_k E(\exp(\theta X_k)) .$$ Now, using $1 + t \le e^t$, $$ E(\exp(\theta X_k)) = 1-p_k + p_k e^{\theta} \le \exp((e^{\theta} - 1) p_k ) .$$ So $$ P(S_n \ge x) \le \exp(-\theta x + (e^\theta -1) \mu ) \le \exp(-\theta x + e^\theta \mu ) .$$ Setting $\theta = \log (x/\mu) > 0$ yields $$ P(S_n \ge x) \le \exp(-x \log(x/\mu) + x) = \left(\frac{\mu e}{x}\right)^x .$$
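The final substitution can be checked numerically. The values of $\mu$ and $x$ below are arbitrary illustrative choices with $x > \mu$; the point is that at $\theta = \log(x/\mu)$ the relaxed exponent reproduces the claimed bound exactly:

```python
import math

mu, x = 1.1, 3.0                    # illustrative values with x > mu
theta = math.log(x / mu)            # the optimizing choice, theta > 0

# Relaxed Chernoff bound exp(-theta*x + e^theta * mu) at this theta ...
relaxed = math.exp(-theta * x + math.exp(theta) * mu)
# ... equals the claimed bound (mu*e/x)^x, since e^theta * mu = x here.
claimed = (mu * math.e / x) ** x
assert math.isclose(relaxed, claimed)

# The unrelaxed bound exp(-theta*x + (e^theta - 1)*mu) is tighter still.
tight = math.exp(-theta * x + (math.exp(theta) - 1) * mu)
assert tight <= relaxed
```

Keeping the $-\mu$ term would give the sharper Chernoff bound $\exp(-x\log(x/\mu) + x - \mu)$; dropping it costs a factor $e^{\mu}$ but produces the cleaner expression asked about.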