I've been reading Hopcroft's "Foundations of Data Science", specifically the part about probability. There is this exercise (Exercise 12.16; the book is freely available if you wish to check):
Let $s$ be the sum of $n$ independent random variables $x_1, x_2, \ldots, x_n$, where for each $i$:
$x_i = \begin{cases} 0 & \text{with probability } p \\ 1 & \text{with probability } 1-p \end{cases}$
The question is:
How large must $\delta$ be if we wish to have $\text{Prob}(s < (1-\delta)m) < \epsilon$?
What I've done so far:
The book mentions a theorem (12.3) and it states:
For any $\delta > 0$, $\displaystyle \text{Prob}(s > (1+\delta)m) < \left(\frac{e^\delta}{(1+\delta)^{(1+\delta)}}\right)^m$
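To get a feel for what Theorem 12.3 says, here is a quick simulation sketch comparing the empirical upper-tail probability with the bound (the values of $n$, $p$, $\delta$ and the trial count are arbitrary choices of mine, not from the book):

```python
import math
import random

random.seed(0)
n, p = 100, 0.5            # x_i = 0 with prob p, 1 with prob 1 - p
m = n * (1 - p)            # E[s] = n(1 - p)
delta = 0.4
trials = 5000

# Empirical estimate of Prob(s > (1 + delta) * m)
exceed = 0
for _ in range(trials):
    s = sum(1 for _ in range(n) if random.random() >= p)  # one sample of s
    if s > (1 + delta) * m:
        exceed += 1
empirical = exceed / trials

# Right-hand side of Theorem 12.3
bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** m

print(f"empirical tail prob = {empirical}, Chernoff bound = {bound:.4f}")
```

The bound is far from tight here (the true tail probability is much smaller), but the inequality does hold, which is all the theorem promises.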
So far I know the expected value of the sum is $E[s] = n(1-p)=m$
I tried setting $\epsilon$ equal to the right-hand side of the inequality in Theorem 12.3, but I get nowhere. (I also notice that Theorem 12.3 bounds the upper tail $s > (1+\delta)m$, while the exercise asks about the lower tail $s < (1-\delta)m$.) What is the appropriate way to approach this problem?
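Explicitly, setting the right-hand side of Theorem 12.3 equal to $\epsilon$ and taking logarithms, the equation I was trying to solve is

$$m\left(\delta - (1+\delta)\ln(1+\delta)\right) = \ln \epsilon,$$

which I could not solve for $\delta$ in closed form.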