From Wikipedia,
$$ D(x \| y)=x \ln \frac{x}{y}+(1-x) \ln \left(\frac{1-x}{1-y}\right) $$ is the Kullback-Leibler divergence between Bernoulli distributed random variables with parameters $x$ and $y$ respectively.
I need to solve for $\epsilon$ in equations of the form $$D(p+\epsilon \| p)=n$$ for $n>0$ and $0\leq p\leq 1$.
I don't think an analytical solution exists for this equation. (Or does one?)
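Absent a closed form, $\epsilon$ can at least be found numerically: $D(p+\epsilon \| p)$ is increasing in $\epsilon$ on $[0, 1-p)$, so bisection works. A minimal pure-Python sketch (function names are my own, and it assumes $0<p<1$):

```python
import math

def kl_bernoulli(x, y):
    """KL divergence between Bernoulli(x) and Bernoulli(y), assuming 0 < y < 1.

    Uses the convention 0*log(0) = 0 at the endpoints x = 0 and x = 1.
    """
    d = 0.0
    if x > 0:
        d += x * math.log(x / y)
    if x < 1:
        d += (1 - x) * math.log((1 - x) / (1 - y))
    return d

def solve_epsilon(p, n, tol=1e-12):
    """Find eps in (0, 1-p) with D(p+eps || p) = n by bisection.

    Returns None when n >= D(1 || p) = -log(p), in which case no
    solution exists on the interval.
    """
    if n >= kl_bernoulli(1.0, p):
        return None
    lo, hi = 0.0, 1.0 - p
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kl_bernoulli(p + mid, p) < n:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, `solve_epsilon(0.3, 0.05)` returns the $\epsilon$ with $D(0.3+\epsilon \| 0.3) = 0.05$ to machine precision.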
My next thought was to find a function $f(x,y)\leq D(x \| y)$ for which $\epsilon$ could be solved analytically in $$f(p+\epsilon,p)=n,$$
since a lower bound suffices for my application.
The tightest lower bound that I could find (which produced an analytical solution) was,
$$D_{1/2}(x \| y)=-2 \log \left(\sqrt{xy}+\sqrt{(1-x)(1-y)}\right)$$
belonging to the family of Rényi divergences.
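For reference, here is one way to see the analytical solution in this case (a sketch, using a standard trigonometric substitution). Setting $D_{1/2}(p+\epsilon \| p)=n$ and exponentiating gives

```latex
\sqrt{(p+\epsilon)\,p} + \sqrt{(1-p-\epsilon)(1-p)} = e^{-n/2}.
% Substitute sqrt(p) = sin(phi), sqrt(p + eps) = sin(psi):
\text{With } \varphi = \arcsin\sqrt{p},\ \psi = \arcsin\sqrt{p+\epsilon},
\text{ the left side is } \sin\psi\sin\varphi + \cos\psi\cos\varphi = \cos(\psi - \varphi),
% so psi - phi = arccos(e^{-n/2}) (taking psi > phi since eps > 0), giving
\epsilon = \sin^2\!\left(\arcsin\sqrt{p} + \arccos e^{-n/2}\right) - p,
% valid while the angle sum stays at most pi/2, i.e. while p + eps <= 1.
```

Since $D_\alpha$ is nondecreasing in $\alpha$ and the KL divergence is $D_1$, this $\epsilon$ correspondingly bounds the solution of $D(p+\epsilon \| p)=n$.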
Since the KL divergence is a well-studied function, I expect there to be some known result on this. Is there a known solution to this problem?
If not, how should I tackle it?
EDIT
My question is related to the Chernoff bounds. See my question (and answer) on the TCS Stack Exchange.