Let $X_1,\ldots,X_n$ be independent random variables with values in $[0,1]$. Let $S_n=\sum_{k=1}^n X_k$ and $m=E(S_n)$. Prove that for $t\in [m,n)$, $$P(S_n\geq t)\leq \left(\frac mt\right)^{t} \left(\frac {n-m}{n-t}\right)^{n-t}$$
This is Exercise 2.4 in Devroye's Combinatorial Methods in Density Estimation.
For $t=m$ the inequality is trivial: $P(S_n\geq m)\leq 1$. For $t=n$ the inequality still makes sense: since $\lim_{x\to 0^+} x\log x = 0$, the factor $\left(\frac{n-m}{n-t}\right)^{n-t}$ tends to $1$ as $t\to n$, and the bound becomes $\displaystyle P(S_n\geq n)\leq \exp\left(n\log\left(\frac mn \right) \right)=\left(\frac mn\right)^n$.
The usual Hoeffding bound yields $\displaystyle P(S_n\geq n)\leq\exp\left(-\frac{2(n-m)^2}n \right)$. Since $n^2\log\left(\frac nm \right)\geq 2(n-m)^2$ whenever $m<n$ (this is the elementary inequality $\log(1/q)\geq 2(1-q)^2$ for $q\in(0,1]$, applied with $q=m/n$), the proposed concentration bound is tighter than Hoeffding's at $t=n$, presumably because it exploits the constraint $t\leq n$.
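As a quick numeric sanity check (an illustrative sketch, not part of the argument; the function names and the choice $m=5$ are mine), one can compare the two exponents at $t=n$ for a few values of $n$:

```python
import math

# Compare -log of the two tail bounds at t = n, for fixed m and growing n.
# Proposed bound at t = n: exp(-n*log(n/m)); Hoeffding: exp(-2*(n-m)^2/n).
# The proposed bound is tighter exactly when n^2*log(n/m) >= 2*(n-m)^2.

def proposed_exponent(n, m):
    """-log of the proposed bound evaluated at t = n."""
    return n * math.log(n / m)

def hoeffding_exponent(n, m):
    """-log of the Hoeffding bound for P(S_n >= n)."""
    return 2 * (n - m) ** 2 / n

m = 5.0
for n in [10, 100, 1000]:
    print(n, proposed_exponent(n, m) >= hoeffding_exponent(n, m))
```

With $m=5$ this prints `True` for every $n$ tried, consistent with the inequality above.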
I suspect that a Chernoff bound should be used, somehow tailored to account for the additional constraint on $t$.
The key is to notice that $E[X_i^k]\le E[X_i]$ for $k\ge 1$, since $X_i\in[0,1]$ almost surely. Consequently, for $\lambda>0$, the inequality $$E[\exp(\lambda X_i)]\le 1 + E[X_i]\left(\lambda+\frac{\lambda^2}{2}+\frac{\lambda^3}{6}+\cdots\right)=1+E[X_i](e^\lambda-1)$$ holds.

Then the Chernoff bounding trick, i.e., Markov's inequality applied to $\exp(\lambda S_n)$, gives by independence $$\log P(e^{\lambda S_n}\ge e^{\lambda t})\le \sum_i \log\bigl(1 + E[X_i](e^\lambda-1)\bigr) - \lambda t.$$

At this point, for fixed $\lambda$ the map $x\mapsto\log(1+x(e^\lambda -1))$ is concave, so by Jensen's inequality the right-hand side is bounded by $$ n \log\left(1+\tfrac m n(e^\lambda-1)\right) - \lambda t.$$

Minimizing over $\lambda$ reveals that the bound is sharpest when $t = n \frac{\frac{m}{n} e^\lambda}{1+\frac m n (e^\lambda -1)}$, or equivalently $e^{\lambda} = \frac t m \cdot\frac{n-m}{n-t}$; note that this is $>1$ precisely when $m<t<n$, so $\lambda>0$ is admissible. For this value of $\lambda$ we get $1+\frac mn(e^\lambda-1)=\frac{n-m}{n-t}$, and the bound becomes $$n\log\frac{n-m}{n-t} - t\log\left(\frac tm\cdot\frac{n-m}{n-t}\right) = t\log\frac mt + (n-t)\log\frac{n-m}{n-t},$$ which is exactly the logarithm of the claimed bound.
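As a Monte Carlo sanity check of the final bound (an illustrative sketch; the choices $X_i\sim\mathrm{Uniform}[0,1]$, $n=20$, $t=14$, and the trial count are mine, not from the exercise):

```python
import random

# Sanity-check the bound P(S_n >= t) <= (m/t)^t * ((n-m)/(n-t))^(n-t)
# on uniform X_i, for which m = E[S_n] = n/2.
random.seed(0)
n, trials = 20, 200_000
m = n / 2
t = 14.0  # arbitrary point in (m, n)

hits = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))  # one sample of S_n
    if s >= t:
        hits += 1
empirical = hits / trials

bound = (m / t) ** t * ((n - m) / (n - t)) ** (n - t)
assert empirical <= bound
print(f"empirical tail: {empirical:.5f}, bound: {bound:.5f}")
```

Here the bound is about $0.19$ while the empirical tail probability is far smaller, as expected from a Chernoff-type bound evaluated a few standard deviations above the mean.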