Bound on Rényi divergence under addition of random variables?


Consider two random variables: $X$ with density $p_X(x)$ and $Y$ with density $p_Y(y)$. These random variables have Rényi divergence of order $\alpha$ equal to $R_1 = D_\alpha(p_X || p_Y)$.
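For concreteness, the order-$\alpha$ Rényi divergence is easy to evaluate numerically in the discrete case; here is a minimal sketch (the function name and example distributions are my own, not from any of the referenced results):

```python
import numpy as np

def renyi_div(p, q, alpha):
    """D_alpha(p || q) = log( sum_i p_i^alpha * q_i^(1 - alpha) ) / (alpha - 1)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

# Example: for alpha = 2, D_2(p || q) = log( sum_i p_i^2 / q_i )
print(renyi_div([0.5, 0.5], [0.25, 0.75], alpha=2.0))  # log(4/3)
```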

Now noise is added:

$X' = X + A$

$Y' = Y + B$

The question is: what is the least upper bound on $D_\alpha(p_{X'} || p_{Y'})$?

The divergences involving the noise are upper bounded: $D_\alpha(p_A || p_B) \leq R_2$, $D_\alpha(p_B || p_A) \leq R_2$, $D_\alpha(p_X || p_A) \leq R_2$, $D_\alpha(p_A || p_X) \leq R_2$, $D_\alpha(p_Y || p_B) \leq R_2$, $D_\alpha(p_B || p_Y) \leq R_2$. It is assumed throughout that $\alpha > 1$ and that $R_1$ and $R_2$ may depend on $\alpha$.

We also know the following: all distributions are unimodal, and the variances and modes of $A$ and $B$ differ by at most 10% (in other words, $0.9 \leq \operatorname{var}(A)/\operatorname{var}(B) \leq 1.1$ and $-0.1 \leq (\operatorname{mode}(A) - \operatorname{mode}(B))/\operatorname{mode}(B) \leq 0.1$).

A few attempts I have already made:

$D_\alpha(p_{X+A} || p_{Y+A}) \leq D_\alpha(p_X || p_Y)$ by the data processing inequality (convolving both distributions with the same noise is a stochastic map); note this handles only the case $A = B$.
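This instance of data processing can be checked numerically in the discrete case, since $p_{X+A}$ is the convolution $p_X * p_A$ when the summands are independent (the distributions below are illustrative only):

```python
import numpy as np

def renyi_div(p, q, alpha):
    # D_alpha(p || q) for discrete distributions
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

alpha = 2.0
pX = np.array([0.7, 0.2, 0.1])
pY = np.array([0.2, 0.3, 0.5])
pA = np.array([0.5, 0.3, 0.2])   # the *same* noise added on both sides

# p_{X+A} is the convolution p_X * p_A for an independent sum
pXA = np.convolve(pX, pA)
pYA = np.convolve(pY, pA)

print(renyi_div(pXA, pYA, alpha) <= renyi_div(pX, pY, alpha))  # True
```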

$D_\alpha(p_{X'} || p_{Y'}) \leq R_1 + R_2$ by Theorems 9 and 28 from [1]: the divergence of the product distributions satisfies $D_\alpha(p_X p_A || p_Y p_B) = R_1 + D_\alpha(p_A || p_B) \leq R_1 + R_2$ by additivity, and addition $(X, A) \mapsto X + A$ is a deterministic map, so data processing applies.
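A numeric sanity check of the $R_1 + R_2$ bound, again with made-up discrete distributions and independent noises:

```python
import numpy as np

def renyi_div(p, q, alpha):
    # D_alpha(p || q) for discrete distributions
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

alpha = 2.0
pX = np.array([0.7, 0.2, 0.1]); pY = np.array([0.2, 0.3, 0.5])
pA = np.array([0.6, 0.3, 0.1]); pB = np.array([0.5, 0.3, 0.2])

R1 = renyi_div(pX, pY, alpha)   # D_alpha(p_X || p_Y)
R2 = renyi_div(pA, pB, alpha)   # D_alpha(p_A || p_B)

# p_{X'} = p_X * p_A and p_{Y'} = p_Y * p_B (independent sums)
pXp = np.convolve(pX, pA)
pYp = np.convolve(pY, pB)

print(renyi_div(pXp, pYp, alpha) <= R1 + R2)  # True
```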

[1] T. van Erven and P. Harremoës, 'Rényi Divergence and Kullback-Leibler Divergence', IEEE Transactions on Information Theory, vol. 60, no. 7, 2014.