I got stuck on the following problem, which seemed not very complicated at first!
I would like to compute the CDF of a binomial distribution, \begin{equation*} F(\ell;n,q) = \sum_{k=0}^{\ell} \binom{n}{k} q^k (1-q)^{n-k}, \end{equation*} where $\ell$ is the solution of \begin{equation*} \sum_{k=0}^{\ell} \binom{n}{k} = 2^{\alpha n} \tag{*} \end{equation*} for some fixed $0 < \alpha \le 1$.
Of course, the exact value of $F(\ell;n,q)$ is not important (and I suspect it admits no closed form). What I would like to know is whether $F(\ell;n,q)$ tends to $0$ as $n \to \infty$ or not.
One way to see this is to check whether $\ell > n q$ holds. However, I cannot see how to decide this from (*). More precisely, I suspect there is a relatively simple criterion, obtained by comparing $\alpha$ with $q$ (or perhaps with $h_2(q)$, the binary entropy of $q$, ...), that tells us whether the CDF goes to $0$ or not as $n$ gets large.
Any ideas?
Thanks in advance!
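For concreteness, here is a small numerical sketch of the quantity in question (my own; it reads (*) as "the largest $\ell$ whose partial sum does not exceed $2^{\alpha n}$", which is an assumption, since exact equality rarely holds for integer $\ell$ — the choices $\alpha = 0.6$, $q = 0.3$ are arbitrary):

```python
from math import comb

def ell(n, alpha):
    """Largest l with sum_{k=0}^{l} C(n, k) <= 2^(alpha*n).

    Assumption: (*) is read as an inequality, since exact equality
    rarely holds for integer l.
    """
    target = 2.0 ** (alpha * n)
    total, l = 0, -1
    for k in range(n + 1):
        if total + comb(n, k) > target:
            break
        total += comb(n, k)
        l = k
    return l

# How does l(n)/n compare with q as n grows?
alpha, q = 0.6, 0.3
for n in (50, 100, 200, 400, 800):
    l = ell(n, alpha)
    print(f"n = {n:4d}   l/n = {l / n:.4f}   q = {q}")
```

The ratio $\ell(n)/n$ appears to settle near a constant depending only on $\alpha$, which is what makes a criterion comparing that constant with $q$ plausible.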
If $\alpha = 1$ then $\ell(n) = n$ so $F$ is constant and equal to $1$.
Otherwise, let $\eta$ be the unique real number with $0 < \eta < \frac{1}{2}$ such that $h_2(\eta) = \alpha$, where $h_2(x) = -x\log_2 x - (1-x)\log_2(1-x)$ denotes the binary entropy function. One can show that $\ell(n) \sim \eta n$ using Stirling's formula and the very crude bounds $$ \log \binom{n}{\ell} \leq \log \sum_{k=0}^\ell \binom{n}{k} \leq \log n + \log \binom{n}{\ell}, \qquad \ell \leq \frac{n}{2}. $$ Writing $F(\ell(n);n,q) = \mathbb{P}(S_n/n \leq \ell(n)/n)$ with $S_n \sim \operatorname{Bin}(n,q)$, the weak law of large numbers ($S_n/n \to q$ in probability) now gives $$ \lim_{n\to\infty}F(\ell(n);n,q) = \begin{cases}0 & \text{if } \eta < q,\\1 & \text{if } \eta > q.\end{cases} $$ The case $\eta = q$ requires knowing the second term of the asymptotic expansion of $\ell(n)$.
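A quick numerical check of this criterion (my own sketch, not part of the argument above): invert $h_2$ by bisection to get $\eta$, take $\ell \approx \eta n$, and compute the CDF directly. The values $q = 0.3$ and $\alpha \in \{0.5, 0.97\}$ are arbitrary illustrative choices, placed on either side of $h_2(q)$.

```python
from math import comb, log2

def h2(x):
    """Binary entropy h2(x) = -x log2 x - (1-x) log2(1-x)."""
    return -x * log2(x) - (1 - x) * log2(1 - x)

def eta(alpha):
    """Solve h2(eta) = alpha for 0 < eta < 1/2 by bisection
    (h2 is increasing on this interval)."""
    lo, hi = 1e-12, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        if h2(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def binom_cdf(l, n, q):
    """F(l; n, q) = P(Bin(n, q) <= l), summed directly."""
    return sum(comb(n, k) * q**k * (1 - q)**(n - k) for k in range(l + 1))

q = 0.3
for alpha in (0.5, 0.97):  # eta ~ 0.110 < q, resp. eta ~ 0.398 > q
    e = eta(alpha)
    for n in (100, 200, 400):
        print(f"alpha = {alpha}  eta = {e:.3f}  n = {n:3d}  "
              f"F = {binom_cdf(round(e * n), n, q):.2e}")
```

With $\eta < q$ the printed CDF collapses toward $0$ as $n$ grows, and with $\eta > q$ it climbs toward $1$, matching the dichotomy above.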