Let $X,Y$ be two Poisson binomial distributions supported on $[n] = \{i \mid i \in \mathbb{Z},\ 0\leq i\leq n \}$. Denote by $d(X,Y)$ their total variation distance. I would like to find an upper bound on the absolute difference of their expected values, assuming that $d(X,Y) \leq \epsilon$ for some $0<\epsilon<1$.
After some experimenting in R, it seems that \begin{equation} |\mathbb{E}[X] - \mathbb{E}[Y]| \leq \epsilon \log n \end{equation} holds for at least $9/10$ of randomly generated instances, which suggests that the correct bound may lie in $\mathcal{O}(\epsilon \log n)$.
References to such bounds, or ideas on how to prove this bound (or a similar one), would be very helpful.
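For concreteness, here is a minimal version of my experiment, rewritten in Python (the helper names `pb_pmf` and `tv` and the perturbation scheme are my own choices, not standard library functions): compute the exact PMFs by convolving the Bernoulli factors, measure the total variation distance, and compare the mean gap against $\epsilon \log n$.

```python
import numpy as np

def pb_pmf(p):
    """Exact PMF of a Poisson binomial with success probabilities p,
    obtained by convolving the n Bernoulli(p_i) factors."""
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])
    return pmf

def tv(f, g):
    """Total variation distance between two PMFs on {0, ..., n}."""
    return 0.5 * np.abs(f - g).sum()

rng = np.random.default_rng(0)
n = 50
p = rng.uniform(size=n)                                       # parameters of X
q = np.clip(p + rng.uniform(-0.02, 0.02, size=n), 0.0, 1.0)   # perturbed parameters of Y

eps = tv(pb_pmf(p), pb_pmf(q))
mean_gap = abs(p.sum() - q.sum())   # E[X] = sum_i p_i for a Poisson binomial
print(f"d(X,Y) = {eps:.4f}, |E[X]-E[Y]| = {mean_gap:.4f}, eps*log(n) = {eps * np.log(n):.4f}")
```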
Let $X,Y$ be two Poisson binomial distributions. Let $\mu_x = \mathbb{E}[X],\ \mu_y = \mathbb{E}[Y],\ \sigma_x^2 = \mathrm{Var}(X),\ \sigma_y^2 = \mathrm{Var}(Y)$. For any $\epsilon>0$ such that $\sigma_x^2 ,\sigma_y^2 \geq n\epsilon\ln(n/\epsilon)/2$, if $X,Y$ are $\epsilon$-close, that is $d(X,Y) \leq \epsilon$, then \begin{align*} \left| \mu_y - \mu_x \right| \leq \frac{ 2\epsilon + \sqrt{\epsilon} \left(\sigma_x +\sigma_y\right) }{1-\epsilon} = O\left(\sqrt{\epsilon} (\sigma_x +\sigma_y)\right) \end{align*}
Proof
Let $Z$ be a random variable. Denote by $F_z$ and $\bar{F}_z$ the cumulative and the complementary cumulative distribution function of $Z$, respectively, and define $\delta_i :=F_x(i) - F_y(i)$. Without loss of generality assume that $\mu_y > \mu_x$. \begin{align*} \mu_y - \mu_x &= \sum_{i=0}^n \left(\bar{F}_y(i) - \bar{F}_x(i)\right) \\ & = \sum_{i=0}^n\left(F_x(i) - F_y(i)\right) \\ &\leq \sum_{i=0}^{\mu_x-\sigma_x/\sqrt{\epsilon}} \delta_i + \sum_{i=\mu_x-\sigma_x/\sqrt{\epsilon}}^{\mu_y+\sigma_y/ \sqrt{\epsilon}} \delta_i + \sum_{i=\mu_y+\sigma_y/\sqrt{\epsilon}}^{n} \delta_i \end{align*} The contribution of the lower tail of $X$ is at most $\epsilon$: \begin{equation*} \sum_{i=0}^{\mu_x-\sigma_x/\sqrt{\epsilon}} \delta_i \leq \sum_{i=0}^{\mu_x-\sigma_x/\sqrt{\epsilon}} F_x(i) \ \leq\ n F_x\left(\mu_x -\sigma_x/\sqrt{\epsilon}\right) \ \leq\ n \exp\left(-\frac{2\sigma_x^2}{n\epsilon}\right) \ \leq\ \epsilon \end{equation*} The third inequality is Hoeffding's inequality (a Poisson binomial is a sum of $n$ independent Bernoulli variables); the last follows from the assumption $\sigma_x^2\geq n\epsilon \ln\left(n/\epsilon\right)/2$, since then $n \exp\left(-2\sigma_x^2/(n\epsilon)\right) \leq n \exp\left(-\ln(n/\epsilon)\right) = \epsilon$. In the same way one can show that the contribution of the upper tail of $Y$ is at most $\epsilon$.
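As a numerical sanity check on the tail step, the following sketch (with parameters of my own choosing: $n=40$, all $p_i = 1/2$, $\epsilon = 0.05$, which satisfy the variance condition) computes the lower-tail sum exactly and confirms it is far below $\epsilon$:

```python
import math
import numpy as np

def pb_pmf(p):
    # exact Poisson binomial PMF via convolution of the Bernoulli factors
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])
    return pmf

n, eps = 40, 0.05
p = np.full(n, 0.5)            # X ~ Binomial(40, 1/2), a special Poisson binomial
mu = p.sum()                   # 20
var = (p * (1.0 - p)).sum()    # 10
# variance condition of the claim: sigma^2 >= n*eps*ln(n/eps)/2
assert var >= n * eps * math.log(n / eps) / 2

cdf = np.cumsum(pb_pmf(p))
cutoff = int(mu - math.sqrt(var) / math.sqrt(eps))   # floor of mu - sigma/sqrt(eps)
lower_tail = cdf[: cutoff + 1].sum()                 # sum_{i=0}^{cutoff} F_x(i)
print(f"lower-tail contribution = {lower_tail:.2e} <= eps = {eps}")
```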
Notice that $d(X,Y) \leq \epsilon$ implies $\left| F_x(i) - F_y(i) \right| \leq \epsilon$ for every $i$, since $\{0,\dots,i\}$ is an event. Hence \begin{equation*} \sum_{i=\mu_x-\sigma_x/\sqrt{\epsilon}}^{\mu_y+\sigma_y/ \sqrt{\epsilon}} \delta_i \leq \left( \mu_y +\sigma_y/\sqrt{\epsilon} - \mu_x + \sigma_x/\sqrt{\epsilon} \right) \epsilon = \left(\mu_y - \mu_x\right) \epsilon + \left(\sigma_x +\sigma_y\right) \sqrt{\epsilon} \end{equation*} Combining the three bounds gives \begin{equation*} \mu_y -\mu_x \leq 2\epsilon + \left(\mu_y - \mu_x\right) \epsilon + \left(\sigma_x +\sigma_y\right) \sqrt{\epsilon}, \end{equation*} and rearranging yields \begin{equation*} \mu_y - \mu_x \leq \frac{ 2\epsilon + \sqrt{\epsilon}\left(\sigma_x +\sigma_y\right)}{1-\epsilon} = O\left(\sqrt{\epsilon} (\sigma_x +\sigma_y)\right) \end{equation*}
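Finally, a quick numerical check of the resulting bound on an example of my own choosing (two binomials, $n = 40$, with $p = 0.50$ versus $p = 0.51$, so $\mu_y - \mu_x = 0.4$):

```python
import math
import numpy as np

def pb_pmf(p):
    # exact Poisson binomial PMF via convolution of the Bernoulli factors
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])
    return pmf

n = 40
p = np.full(n, 0.50)   # parameters of X
q = np.full(n, 0.51)   # parameters of Y
fx, fy = pb_pmf(p), pb_pmf(q)

eps = 0.5 * np.abs(fx - fy).sum()        # d(X, Y), exact total variation distance
mu_x, mu_y = p.sum(), q.sum()
sx = math.sqrt((p * (1 - p)).sum())
sy = math.sqrt((q * (1 - q)).sum())

bound = (2 * eps + math.sqrt(eps) * (sx + sy)) / (1 - eps)
print(f"|mu_y - mu_x| = {mu_y - mu_x:.3f} <= bound = {bound:.3f}  (eps = {eps:.4f})")
```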