Consider the Bhattacharyya distance
$$B = -\ln \sum_k \sqrt{p_k q_k},$$
where $p$ and $q$ are probability distributions, $\sum_k p_k = \sum_k q_k = 1$. I know that the Bhattacharyya distance is upper bounded by the Chernoff distance, i.e., $B \leq C$ with
$$C = \sup_{t\in[0,1]}\left\{-\ln\sum_k p_k^t q_k^{1-t}\right\}.$$
Equality holds for symmetric distributions $p$ and $q$, in which case the supremum is attained at $t = 1/2$.
By messing around, I found that $C$ also yields a lower bound on the Bhattacharyya distance:
$$B \geq C/2.$$
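As a quick numerical sanity check (a minimal sketch, not part of the proof; the helper names `bhattacharyya` and `chernoff` and the grid resolution are my own choices), one can approximate $C$ by maximizing over a grid of $t$ values and verify the sandwich $C/2 \leq B \leq C$ on random distributions:

```python
import numpy as np

def bhattacharyya(p, q):
    """B = -ln sum_k sqrt(p_k q_k)."""
    return -np.log(np.sum(np.sqrt(p * q)))

def chernoff(p, q, grid=np.linspace(0.0, 1.0, 1001)):
    """C = sup over t in [0,1] of -ln sum_k p_k^t q_k^(1-t),
    approximated on a grid (the grid contains t = 1/2 exactly)."""
    return max(-np.log(np.sum(p**t * q**(1 - t))) for t in grid)

rng = np.random.default_rng(0)
p = rng.random(5); p /= p.sum()  # random strictly positive distribution
q = rng.random(5); q /= q.sum()

B = bhattacharyya(p, q)
C = chernoff(p, q)
assert C / 2 <= B <= C + 1e-12   # the claimed sandwich C/2 <= B <= C
```

Since the grid contains $t = 1/2$, the grid maximum is at least $c(1/2) = B$, so the upper bound is checked exactly rather than approximately.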
A sketch of a proof is included below. Feel free to double check.
What I'd like to know is whether this inequality has a name and if I can have some reference for it. Maybe one of you knows. Or maybe it's just too trivial for that.
Proof: Consider the function $c(t) = -\ln\sum_k p_k^t q_k^{1-t}$. First note that $c(t)$ is a concave function of $t$ (the sum $\sum_k p_k^t q_k^{1-t}$ is a sum of exponentials in $t$, hence log-convex) and nonnegative on $[0,1]$, by H&ouml;lder's inequality. Moreover, it vanishes at $t = 0$ and at $t = 1$. Therefore, the supremum of $c(t)$ is attained at some $t^* \in [0,1]$. The value of $c(t)$ at $t^*$ is the Chernoff distance:
$$c(t^*) = C.$$
Similarly, the Bhattacharyya distance is the value of $c(t)$ at $t = 1/2$:
$$c(1/2) = B.$$
Without loss of generality, focus on the case $t^* < 1/2$. The proof for $t^* > 1/2$ is similar. Consider the line through the points $(1/2, B)$ and $(1, 0)$ on the graph of $c(t)$. The equation of this line is $y(t) = 2B(1-t)$. Because $c(t)$ is concave, it lies below the extension of any of its chords outside the chord's interval; since $t^* < 1/2$, the supremum $c(t^*)$ must lie below this line:
$$c(t^*) \leq y(t^*) \Rightarrow C \leq 2B(1-t^*) \Rightarrow C \leq 2B,$$
where the last step uses $1 - t^* \leq 1$.
This proves the theorem.
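The chord argument above can also be checked numerically (again just a sketch; the example distributions, the helper `c_func`, and the grid are arbitrary choices of mine). Concavity should put $c(t)$ below the extended chord $y(t) = 2B(1-t)$ for every $t < 1/2$, which is exactly the bound the proof uses:

```python
import numpy as np

def c_func(p, q, t):
    """c(t) = -ln sum_k p_k^t q_k^(1-t)."""
    return -np.log(np.sum(p**t * q**(1 - t)))

# An asymmetric pair, so the supremum need not sit at t = 1/2
p = np.array([0.8, 0.15, 0.05])
q = np.array([0.1, 0.3, 0.6])

ts = np.linspace(0.0, 1.0, 1001)
c = np.array([c_func(p, q, t) for t in ts])
B = c_func(p, q, 0.5)

# For t < 1/2, concavity keeps c(t) below the extension of the chord
# through (1/2, B) and (1, 0), i.e. c(t) <= 2B(1-t).
left = ts < 0.5
assert np.all(c[left] <= 2 * B * (1 - ts[left]) + 1e-12)

# Consequently the (grid) Chernoff distance satisfies C <= 2B.
assert c.max() <= 2 * B + 1e-12
```

The same check with $p$ and $q$ swapped covers the mirror case $t^* > 1/2$.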