In Esser, Kübler, May (2017), Chernoff bounds are used to check whether the Hamming weight of a certain vector is consistent with a binomial distribution.
The relevant parts are:

I looked up Wikipedia's description of the tail bounds for binomial distributions, but Wikipedia notes that those hold for $k \le np$, which here would mean $c \le \tau m$. However, the authors take $c > \tau m$.
I want to work out what is happening here, both to improve my understanding of the paper and to let me tweak the variables and formulae for my own purposes. So how are the authors applying Chernoff bounds here?
$\def\e{\mathrm{e}}\def\deq{\stackrel{\mathrm{d}}{=}}\def\peq{\mathrel{\phantom{=}}}$We first derive the optimal Chernoff bound for the binomial distribution.
Suppose $X \sim B(m, p)$; then there exist i.i.d. $X_1, \cdots, X_m$ with $X_k \sim B(1, p)$ and $X \deq \sum\limits_{k = 1}^m X_k$. For any $pm < c < m$ and $a > 0$,\begin{align*} P(X \geqslant c) &= P(\e^{aX} \geqslant \e^{ac}) \leqslant \e^{-ac} E(\e^{aX}) = \e^{-ac} (E(\e^{aX_1}))^m\\ &= \exp(-ac + m \ln(p\e^a + 1 - p)). \end{align*} Define $f(a) = -ac + m \ln(p\e^a + 1 - p)$, so that $f'(a) = -c + \dfrac{mp\e^a}{p\e^a + 1 - p}$. Note that $f'(a)$ is increasing, $f'(0) = mp - c < 0$, and$$ f'(a) = 0 \Longleftrightarrow a = a_0 := \ln\left( \frac{1 - p}{p} \cdot \frac{c}{m - c} \right), $$ so the logarithm of the optimal Chernoff bound is $f(a_0)$. After rearranging terms,$$ f(a_0) = c \ln\frac{pm}{c} + (m - c) \ln\frac{(1 - p)m}{m - c}. $$
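As a quick sanity check (not from the paper), here is a minimal Python sketch that compares $\exp(f(a_0))$ against the exact binomial tail; the parameter values $m = 100$, $p = 0.3$, $c = 45$ are an arbitrary illustration satisfying $pm < c < m$:

```python
import math

def chernoff_log_bound(m, p, c):
    """f(a0) = c*ln(pm/c) + (m-c)*ln((1-p)m/(m-c)), valid for pm < c < m."""
    return c * math.log(p * m / c) + (m - c) * math.log((1 - p) * m / (m - c))

def exact_tail(m, p, c):
    """Exact P(X >= c) for X ~ Binomial(m, p)."""
    return sum(math.comb(m, k) * p**k * (1 - p)**(m - k)
               for k in range(math.ceil(c), m + 1))

m, p, c = 100, 0.3, 45                 # arbitrary example with pm = 30 < c < m
bound = math.exp(chernoff_log_bound(m, p, c))
tail = exact_tail(m, p, c)
assert 0 < tail <= bound < 1           # the optimized bound dominates the tail
print(tail, bound)
```

The assertion confirms that the optimized exponent $f(a_0)$ indeed upper-bounds $\ln P(X \geqslant c)$ at this parameter choice.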
Now, to derive the given bounds, make the substitution $x = \dfrac{c}{pm} - 1$, $y = \dfrac{1}{p} - 1$; then $0 < x < y$ and$$ c \ln\frac{pm}{c} + (m - c) \ln\frac{(1 - p)m}{m - c} = pm\left(-(1 + x) \ln(1 + x) + (y - x) \ln\frac{y}{y - x}\right). $$ Define$$ g(x, y) = -(1 + x) \ln(1 + x) + (y - x) \ln\frac{y}{y - x}, \qquad 0 < x < y. $$ For the first bound, it suffices to prove $g(x, y) \leqslant -\dfrac{1}{3} \min(x, x^2)$ for $0 < x < y$. Since $\ln(1 + u) \leqslant u$,\begin{align*} g(x, y) &= -(1 + x) \ln(1 + x) + (y - x) \ln\left( 1 + \frac{x}{y - x} \right)\\ &\leqslant -(1 + x) \ln(1 + x) + (y - x) \cdot \frac{x}{y - x}\\ &= x - (1 + x) \ln(1 + x). \end{align*} Define$$ G_1(x) = \frac{1}{x} (x - (1 + x) \ln(1 + x)),\quad G_2(x) = \frac{1}{x^2} (x - (1 + x) \ln(1 + x)). $$ Because $G_1'(x) = -\dfrac{1}{x^2} (x - \ln(1 + x)) \leqslant 0$, $G_1$ is decreasing, so for $x > 1$,$$ G_1(x) \leqslant G_1(1) = 1 - 2\ln 2\\ \Longrightarrow g(x, y) \leqslant x - (1 + x) \ln(1 + x) \leqslant (1 - 2\ln 2)x. $$ Next, $G_2'(x) = \dfrac{x + 2}{x^3} \left( \ln(1 + x) - \dfrac{2x}{x + 2} \right)$. Define $h(x) = \ln(1 + x) - \dfrac{2x}{x + 2}$; then$$ h'(x) = \frac{x^2}{(x + 1)(x + 2)^2} \geqslant 0 \Longrightarrow h(x) \geqslant h(0) = 0, $$ which implies that $G_2$ is increasing. Thus for $0 < x \leqslant 1$,$$ G_2(x) \leqslant G_2(1) = 1 - 2\ln 2\\ \Longrightarrow g(x, y) \leqslant x - (1 + x)\ln(1 + x) \leqslant (1 - 2\ln 2)x^2. $$ Therefore, $g(x, y) \leqslant (1 - 2\ln 2) \min(x, x^2)$. Since $1 - 2\ln 2 \approx -0.38629 < -\dfrac{1}{3}$, for $p = \tau$,$$ c \ln\frac{\tau m}{c} + (m - c) \ln\frac{(1 - \tau)m}{m - c} \leqslant -\frac{1}{3} \min\left( \frac{c}{\tau m} - 1, \left( \frac{c}{\tau m} - 1 \right)^2 \right) \cdot \tau m. $$
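To double-check the inequality $g(x, y) \leqslant (1 - 2\ln 2)\min(x, x^2) \leqslant -\frac{1}{3}\min(x, x^2)$ numerically, here is a short Python sketch (my own check, not part of the paper) that sweeps a grid of admissible $(x, y)$ pairs:

```python
import math

def g(x, y):
    # g(x, y) = -(1+x)ln(1+x) + (y-x) ln(y/(y-x)), defined for 0 < x < y
    return -(1 + x) * math.log(1 + x) + (y - x) * math.log(y / (y - x))

C = 1 - 2 * math.log(2)             # ≈ -0.38629, sharper constant than -1/3
for i in range(1, 60):
    x = i / 10                      # x ranges over (0, 6)
    for j in range(1, 40):
        y = x + j / 10              # any y > x
        m = min(x, x * x)
        assert g(x, y) <= C * m + 1e-12   # the proved inequality
        assert C * m <= -m / 3            # 1 - 2 ln 2 < -1/3
```

The small tolerance `1e-12` only guards against floating-point rounding; the inequality itself is strict on the grid.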
For the second bound, i.e. $p = \dfrac{1}{2}$ and hence $y = 1$, it suffices to prove $g(x, 1) \leqslant -\dfrac{x^2}{2}$ for $0 < x < 1$. Define $G_3(x) = g(x, 1) + \dfrac{x^2}{2}$; then$$ G_3'(x) = \ln(1 - x) - \ln(1 + x) + x,\quad G_3''(x) = -\frac{x}{1 - x} - \frac{1}{1 + x} \leqslant 0, $$ so $G_3'$ is decreasing. Thus $G_3'(x) \leqslant G_3'(0) = 0$, which implies that $G_3$ is decreasing and $G_3(x) \leqslant G_3(0) = 0$. Therefore, with $x = \dfrac{2c}{m} - 1$,$$ c\ln\frac{m}{2c} + (m - c)\ln\frac{m}{2(m - c)} \leqslant -\frac{1}{2} \left( \frac{2c}{m} - 1 \right)^2 \cdot \frac{m}{2}. $$
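The $p = \frac{1}{2}$ bound can likewise be checked numerically. The following Python sketch (again my own verification, with $m = 200$ as an arbitrary example) confirms both that the exponent is bounded as claimed and that the resulting bound dominates the exact tail:

```python
import math

def log_bound_half(m, c):
    # f(a0) for p = 1/2:  c ln(m/(2c)) + (m-c) ln(m/(2(m-c)))
    return c * math.log(m / (2 * c)) + (m - c) * math.log(m / (2 * (m - c)))

def tail_half(m, c):
    # exact P(X >= c) for X ~ Binomial(m, 1/2)
    return sum(math.comb(m, k) for k in range(c, m + 1)) / 2**m

m = 200
for c in range(101, 200):               # m/2 < c < m
    x = 2 * c / m - 1                   # the substitution with p = 1/2
    lhs = log_bound_half(m, c)
    rhs = -0.5 * x * x * (m / 2)        # claimed upper bound on the exponent
    assert lhs <= rhs + 1e-9
    assert tail_half(m, c) <= math.exp(lhs)
```

Both assertions pass over the whole admissible range of $c$, matching the derivation above.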