Consider the following expression in the three variables $0 \leq p, s \leq 1$ and integer $n > 0$:
$$S_{n, p, s} = \sum_{k=0}^n {n \choose k} p^k (1-p)^{n-k} e^{-s(k - np)^2}$$
If $s = 0$ then $S_{n, p, 0} = 1$ by the binomial theorem.
Is there a closed form for the sum for $ s > 0$? If not, can it be approximated if we assume that $n$ is large?
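For reference, the sum is easy to evaluate numerically for small $n$ (a quick sketch in Python; the function name is mine):

```python
import math

def S(n, p, s):
    """Evaluate S_{n,p,s} by summing the binomial series directly."""
    return sum(
        math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(-s * (k - n * p)**2)
        for k in range(n + 1)
    )

print(S(20, 0.3, 0.0))  # ≈ 1, consistent with the binomial theorem
print(S(20, 0.3, 0.1))  # strictly between 0 and 1 for s > 0
```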
I don't think there is a closed form. As for the behavior when $n\rightarrow\infty$, the limit appears to be $0$ for every $s>0$. Here is a sketch of the proof; I hope there are no embarrassing typos.
$S_{n,p,s}=\mathbb{E}\Big[\exp\big(-s(X_n-\mathbb{E}[X_n])^2\big)\Big]$, where $X_n\sim \mathrm{Bin}(n,p)$ and $\mathbb{E}$ denotes expectation with respect to this binomial distribution.
For $n$ fixed, $X_n$ can be thought of as a sum of $n$ i.i.d. Bernoulli($p$) random variables, say $X_n\stackrel{\text{law}}{=}B_1+\ldots + B_n$.
Since $\mathbb{E}[X_n]=np$, the expression inside the expectation can be rewritten as $\exp\Big(-s\,p(1-p)\,n\Big(\frac{X_n-np}{\sqrt{p(1-p)n}}\Big)^2\Big)$.
By the central limit theorem, $\Big(\frac{X_n-np}{\sqrt{p(1-p)n}}\Big)^2$ converges in law to a $\chi^2_1$ random variable (the square of a standard normal). By Skorokhod's representation theorem there exists a coupling in which the convergence is pointwise a.s.; along that coupling the exponent $-s\,p(1-p)\,n\big(\frac{X_n-np}{\sqrt{p(1-p)n}}\big)^2$ tends to $-\infty$ a.s. (the $\chi^2_1$ limit is a.s. nonzero), so the integrand tends to $0$ a.s. Since the integrand is bounded by $1$, dominated convergence implies $S_{n,p,s}\rightarrow 0$ for every $s>0$.
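This decay can be checked by evaluating the sum exactly for growing $n$. A sketch in Python, working in log space so the binomial coefficients and powers at large $n$ do not overflow or underflow (the function name is mine):

```python
import math

def S(n, p, s):
    # Evaluate S_{n,p,s} term by term in log space: the binomial
    # coefficients are huge and the probabilities tiny for large n.
    total = 0.0
    for k in range(n + 1):
        log_term = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                    + k * math.log(p) + (n - k) * math.log(1 - p)
                    - s * (k - n * p) ** 2)
        total += math.exp(log_term)
    return total

for n in (10, 100, 1000, 10000):
    print(n, S(n, 0.5, 0.01))  # values shrink toward 0 as n grows
```

The decay is slow (roughly like $n^{-1/2}$ on this evidence), which is consistent with the exponent growing linearly in $n$ while the Gaussian fluctuations spread the mass.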
Here are plotted estimates of $h_n(t)=\mathbb{E}\big[\exp\big(-t(X_n-np)^2\big)\big]$ obtained by sampling Bernoulli random variables with $p=0.5$. This provides empirical evidence that indeed $h_n(t)\xrightarrow{n\rightarrow\infty} 0$ for $t>0$. An R script is given below.
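A minimal Monte Carlo version of this simulation (sketched here in Python rather than R; all names and parameters are mine):

```python
import math
import random
import statistics

def h_hat(n, t, p=0.5, reps=2000, seed=1):
    """Monte Carlo estimate of h_n(t) = E[exp(-t (X_n - n p)^2)],
    with X_n simulated as a sum of n Bernoulli(p) draws."""
    rng = random.Random(seed)
    samples = []
    for _ in range(reps):
        x_n = sum(rng.random() < p for _ in range(n))  # one Binomial(n, p) draw
        samples.append(math.exp(-t * (x_n - n * p) ** 2))
    return statistics.fmean(samples)

for n in (10, 100, 1000):
    print(n, h_hat(n, 0.1))  # estimates decrease toward 0 as n grows
```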