How can I show that this converges to zero for some constant $C$, taken as large as needed?
$$\lim\limits_{n\rightarrow\infty} \sum\limits_{k=C\sqrt{ n\log(n)}}^{n}{n \choose k } 2^{-n}$$
This is the same as claiming that the asymptotic probability of getting more than $C\sqrt{n\log n}$ heads in $n$ fair coin tosses is zero. That cannot hold: the expected number of heads is $n/2$, which is far larger than $C\sqrt{n\log n}$.

As a straightforward consequence of Hoeffding's inequality or the Chernoff bound, your limit is in fact $1$.
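A quick numerical sanity check supports this (a sketch; the choice $C = 1$ and the sample sizes are my own, and the tail sum is computed exactly with Python's `math.comb`):

```python
import math

def tail_prob(n, C=1.0):
    """P(Binomial(n, 1/2) >= C * sqrt(n * log n)), computed exactly."""
    k0 = math.ceil(C * math.sqrt(n * math.log(n)))
    # Exact binomial tail; Python's big integers handle the huge coefficients.
    return sum(math.comb(n, k) for k in range(k0, n + 1)) / 2**n

for n in (100, 1000, 5000):
    print(n, tail_prob(n))
```

Already at these sizes the values are numerically indistinguishable from $1$, since the lower cutoff $C\sqrt{n\log n}$ sits far below the mean $n/2$.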
Suppose you toss a fair coin, getting side A or side B with equal probability $\frac{1}{2}$. Let $A_n$ denote the number of times side A appears in $n$ trials.

The sum you give represents the probability $P(A_n \geq C\sqrt{n\log n}) = P\left(\frac{A_n}{n} - C\sqrt{\frac{\log n}{n}} \geq 0\right)$.

By the strong law of large numbers, $\frac{A_n}{n}$ converges to $\frac{1}{2}$ almost surely, while $C\sqrt{\frac{\log n}{n}} \to 0$. Hence the indicator of the event $\left\{\frac{A_n}{n} - C\sqrt{\frac{\log n}{n}} \geq 0\right\}$ converges almost surely to $1$, and by dominated convergence
$$\lim_{n \to \infty} P\left(\frac{A_n}{n} - C\sqrt{\frac{\log n}{n}} \geq 0\right) = 1.$$

Therefore the sum actually converges to $1$.
I would say this is not true. Note that no matter what $C$ is, for large $n$ the summation range contains $\left[\frac{n}{2}-\sqrt{n},\,\frac{n}{2}+\sqrt{n}\right]$, so your sum is asymptotically at least $$ \frac{1}{2^n}\sum_{k=\frac{n}{2}-\sqrt{n}}^{\frac{n}{2}+\sqrt{n}}\binom{n}{k} \geq \frac{2\sqrt{n}}{2^n}\binom{n}{\frac{n}{2}-\sqrt{n}} \xrightarrow[n\to\infty]{}2\sqrt{\frac{2}{\pi}}\,e^{-2} > 0, $$ where the inequality holds because each of the $2\sqrt{n}$ terms is at least the value at the endpoints, where the binomial coefficient is smallest.
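The constant above can be checked numerically (a sketch; the choice $n = 10000$, a perfect square so that $\sqrt{n}$ is an integer, is mine). The lower bound $\frac{2\sqrt{n}}{2^n}\binom{n}{n/2-\sqrt{n}}$ is computed exactly and compared with $2\sqrt{2/\pi}\,e^{-2} \approx 0.21596$:

```python
import math

n = 10_000                 # a perfect square, so sqrt(n) is an integer
r = math.isqrt(n)          # sqrt(n) = 100
# Exact value of the lower bound 2*sqrt(n)/2^n * C(n, n/2 - sqrt(n))
bound = 2 * r * math.comb(n, n // 2 - r) / 2**n
limit = 2 * math.sqrt(2 / math.pi) * math.exp(-2)
print(bound, limit)        # both are close to 0.216
```

The two values agree to several decimal places at this $n$, consistent with the local limit theorem estimate $\binom{n}{n/2 - j}2^{-n} \approx \sqrt{\frac{2}{\pi n}}\,e^{-2j^2/n}$ with $j = \sqrt{n}$.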