Effect of a convolution with a Bernoulli distribution on Rényi divergence


Let $P$ and $Q$ be two probability distributions on $\mathbb{Z}$. Let $D_\alpha(P\|Q)$ be the Rényi divergence of order $\alpha$ of $P$ and $Q$:

$$ D_\alpha(P\|Q)=\frac{1}{\alpha-1}\log\sum_i\frac{P(i)^\alpha}{Q(i)^{\alpha-1}} $$

for $\alpha\gt1$; for $\alpha=1$, $D_1(P\|Q)$ is defined as the Kullback–Leibler divergence between $P$ and $Q$.
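For concreteness, here is a small numerical sketch of this definition (the `renyi_divergence` helper and the toy distributions are mine, for illustration only; distributions are dicts mapping integers to probabilities, and $P$ must be absolutely continuous with respect to $Q$):

```python
import math

def renyi_divergence(P, Q, alpha):
    """Rényi divergence of order alpha > 1 between two discrete
    distributions, in nats: log(sum_i P(i)^alpha / Q(i)^(alpha-1)) / (alpha-1)."""
    s = sum(P[i] ** alpha / Q[i] ** (alpha - 1) for i in P if P[i] > 0)
    return math.log(s) / (alpha - 1)

# Toy example on {0, 1}
P = {0: 0.5, 1: 0.5}
Q = {0: 0.4, 1: 0.6}
print(renyi_divergence(P, Q, alpha=2.0))  # small positive number
```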

Suppose that I have a bound $D_\alpha(P\|Q)\le\varepsilon$ for a given $\varepsilon$ and $\alpha$. Let $B_p$ be a Bernoulli distribution of parameter $p$: $B_p(0)=1-p$ and $B_p(1)=p$.

I want to find the value $p_\min$ of $p$ that minimizes $D_\alpha(P+B_p\|Q+B_p)$, where $P+B_p$ denotes the distribution of the sum of independent random variables drawn from $P$ and $B_p$, i.e., the convolution of the two distributions. I suppose there is no closed-form formula in general, so my questions are:

  • Trivially, if $p=0$ or $p=1$, the convolution only shifts both distributions, so the divergence stays the same; and Theorem 9 of this paper implies that for all $p$, the divergence is at most $\varepsilon$. So $p_\min$ exists, but is it necessarily unique?
  • Can $p_\min$ be arbitrarily close to $0$ or $1$? Or is it possible to show that it lies in $[\beta,1-\beta]$ for some $\beta$ depending on $\alpha$ and $\varepsilon$?
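To build intuition on these questions, one can convolve toy distributions with $B_p$ and scan $p$ numerically. This is only a sketch: the helpers `renyi_divergence` and `convolve_bernoulli` and the particular $P$, $Q$ are my own choices, and a grid scan says nothing rigorous about uniqueness.

```python
import math

def renyi_divergence(P, Q, alpha):
    # D_alpha(P||Q) = log( sum_i P(i)^alpha / Q(i)^(alpha-1) ) / (alpha - 1)
    s = sum(mass ** alpha / Q[i] ** (alpha - 1) for i, mass in P.items() if mass > 0)
    return math.log(s) / (alpha - 1)

def convolve_bernoulli(P, p):
    # Distribution of X + B, with X ~ P and B ~ Bernoulli(p) independent.
    out = {}
    for i, mass in P.items():
        out[i] = out.get(i, 0.0) + (1 - p) * mass
        out[i + 1] = out.get(i + 1, 0.0) + p * mass
    return out

P = {0: 0.5, 1: 0.5}
Q = {0: 0.4, 1: 0.6}
alpha = 2.0

# Grid scan of p in [0, 1]; the minimum lands strictly inside (0, 1)
# for this toy example, consistent with the smoothing intuition.
best_p, best_d = min(
    ((p / 100, renyi_divergence(convolve_bernoulli(P, p / 100),
                                convolve_bernoulli(Q, p / 100), alpha))
     for p in range(101)),
    key=lambda t: t[1],
)
print(best_p, best_d)
```

In this example the scanned divergence at interior $p$ drops below its value at the endpoints $p\in\{0,1\}$ (where it equals $D_\alpha(P\|Q)$, since convolving with a point mass only shifts both distributions).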

If this helps, one can assume that $D_\infty(P\|Q)$ exists. I'm also interested in partial answers for specific values of $\alpha\lt\infty$.