From Wikipedia:
Multiplicative Chernoff bound. Suppose $X_1, \dots, X_n$ are independent random variables taking values in $\{0, 1\}$. Let $X$ denote their sum and let $\mu = E[X]$ denote the sum's expected value. Then for any $\delta > 0$,
$$ \operatorname{Pr}(X \geq(1+\delta) \mu) \leq\left(\frac{e^\delta}{(1+\delta)^{1+\delta}}\right)^\mu $$
My question
Suppose I have a case where the random variables take the value $1$ with a fixed probability $p$ in every round and $0$ with probability $1-p$. I was wondering whether, for this particular scenario, there is a better-performing bound than the Chernoff bound.
Since the multiplicative Chernoff bound depends only on the mean $\mu$ and not on the individual success probabilities, I was hoping this additional information (a common, known $p$ for every round) could lead to a better-performing bound.
Is there anything in the literature related to this?
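For concreteness, here is a small stdlib-only sketch comparing three quantities for my setting: the exact binomial tail $\Pr(X \geq k)$, the relative-entropy (Chernoff–Hoeffding) form $\Pr(X \geq qn) \leq e^{-n\,\mathrm{KL}(q\,\|\,p)}$, which does use $p$ directly, and the multiplicative bound quoted above. The values $n=100$, $p=0.1$, $\delta=0.5$ are my own illustration, not from any source:

```python
import math

def exact_tail(n, p, k):
    # Exact P(X >= k) for X ~ Binomial(n, p), summed term by term.
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

def chernoff_mult(mu, delta):
    # Multiplicative Chernoff bound: (e^delta / (1+delta)^(1+delta))^mu.
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def chernoff_kl(n, p, q):
    # Relative-entropy (Chernoff-Hoeffding) form, valid for q > p:
    # P(X >= q*n) <= exp(-n * KL(q || p)).  Uses p itself, not just the mean.
    kl = q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))
    return math.exp(-n * kl)

# Illustrative parameters: n rounds, success probability p, deviation delta.
n, p, delta = 100, 0.1, 0.5
mu = n * p
k = math.ceil((1 + delta) * mu)  # threshold (1 + delta) * mu

print(exact_tail(n, p, k))       # true tail probability (smallest)
print(chernoff_kl(n, p, k / n))  # KL bound (tighter)
print(chernoff_mult(mu, delta))  # multiplicative bound (loosest of the three)
```

In this example the ordering is exact tail < KL bound < multiplicative bound, since the multiplicative form is derived by relaxing the KL form.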