Use Chebyshev's inequality in sum of Bernoulli random variables


I'm currently studying for my exams and working through related questions. This is a problem I've been stuck on for some time now, and I'd like to know how to solve it.

Setting:

We have $n\geq 10$ independent random variables $X_i$, each Bernoulli distributed with parameter $p_i\in[0,1]$, i.e. $\mathbb{P}[X_i = 1] = p_i$ and $\mathbb{P}[X_i = 0] = 1- p_i$. We define the random variable $X := \sum_{i=1}^n X_i$. We then observe a certain number of realizations of $X$, call them $t_1,\dots,t_d$.

Now assume we always have that $\sum_{i = 1}^{10} X_i \geq 1$. We want to use the first observation ($t_1$) to estimate $\mathbb{E}[X] =: \mu$.

Question:

Use Chebyshev's inequality to show that $$ \mathbb{P}\left[|t_1 - \mu|\geq \frac{\mu}{2}\right] \leq 0.4 $$

Attempt:

First, I can compute the expectation $\mu$ to be $$ \mathbb{E}[X] = \sum_{i=1}^n \mathbb{E}[X_i] = \sum_{i=1}^n p_i $$ Then, since we know that the $X_i$ are independent we can compute the variance to be $$ \text{Var}(X) = \sum_{i=1}^n \text{Var}(X_i) = \sum_{i=1}^n p_i(1-p_i) $$
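As a sanity check on these two formulas, here is a quick Monte Carlo sketch (the vector `p` is an arbitrary hypothetical choice, not from the exercise):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration: n = 10, arbitrary p_i.
p = np.array([0.3, 0.7, 0.5, 0.2, 0.9, 0.4, 0.6, 0.1, 0.8, 0.5])

mu = p.sum()               # E[X] = sum_i p_i
var = (p * (1 - p)).sum()  # Var(X) = sum_i p_i(1 - p_i), using independence

# Draw many realizations of X = sum_i X_i and compare moments.
samples = (rng.random((200_000, p.size)) < p).sum(axis=1)

print(mu, samples.mean())  # theoretical vs. empirical mean
print(var, samples.var())  # theoretical vs. empirical variance
```

The empirical mean and variance match the closed forms to within Monte Carlo error, so at least these two steps are fine.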

Using Chebyshev's inequality gives us $$ \mathbb{P}\left[ |t_1 - \mu|\geq \frac{\mu}{2}\right] \leq \frac{\text{Var}(X)}{(\mu/2)^2} = \frac{4\,\text{Var}(X)}{\mu^2} $$ So it seems that we just want to show that $\text{Var}(X)/\mu^2\leq 1/10$, but from here I don't really see the next steps. The expression $$ \frac{\text{Var}(X)}{\mu^2} = \frac{\sum_{i} p_i(1-p_i)}{\sum_{i,j} p_i\cdot p_j} $$ doesn't seem to lead anywhere.
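Numerically the Chebyshev bound does hold comfortably, at least for one arbitrary hypothetical parameter choice (this only illustrates plausibility, it is not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical choice for illustration: n = 20, all p_i = 1/2.
p = np.full(20, 0.5)
mu = p.sum()               # = 10
var = (p * (1 - p)).sum()  # = 5

# Empirical estimate of P[|t_1 - mu| >= mu/2] from many draws of X.
samples = (rng.random((200_000, p.size)) < p).sum(axis=1)
emp = np.mean(np.abs(samples - mu) >= mu / 2)

print(emp, 4 * var / mu**2)  # empirical probability vs. Chebyshev bound
```

Here the Chebyshev bound $4\,\text{Var}(X)/\mu^2 = 0.2$ is far above the empirical probability, so the inequality I'm asked to prove looks consistent; I just can't see how to get the factor $1/10$ in general.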

By the assumption that $\sum_{i=1}^{10} X_i \geq 1$ always holds, we can also note that $$ \mu = \mathbb{E}[X] \geq 1 \implies \frac{\text{Var}(X)}{\mu^2} \leq \text{Var}(X) $$ But even if we could compute $\text{Var}(X)$, this upper bound seems too large for the purpose of the exercise.

Do you have any idea on how to proceed here? :)