Probability that the difference of two independent Binomials with the same $n$ is negative


Suppose I have $X\sim \operatorname{Binom}(n,p_0)$ and $Y\sim \operatorname{Binom}(n,p_1)$ with $p_0>p_1$, and I need to bound $$P(X-Y\le 0).$$ One way to do this is to write $X-Y$ as a sum of i.i.d. random variables, each the difference of a $\operatorname{Bern}(p_0)$ and a $\operatorname{Bern}(p_1)$ variable, and apply the Chernoff bound to get $$P(X-Y\le 0)\le \exp\left(-\frac{1}{2}n(p_0-p_1)^2\right).$$ Since the Chernoff bound does not utilize the variance, one of Bernstein's inequalities should give a tighter bound. I have tried Inequality (1) given here. However, when I did so, I found a messy exponent in the upper bound which isn't always tighter than the one given by the Chernoff bound.
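For intuition, here is a small sketch comparing the exact probability (by direct summation over the joint pmf, using only the standard library) with the Chernoff-type bound above; the parameter values are arbitrary:

```python
import math

def exact_prob_X_le_Y(n, p0, p1):
    """P(X - Y <= 0) for independent X ~ Binom(n, p0), Y ~ Binom(n, p1),
    computed by summing the joint pmf over the event {X <= Y}."""
    def pmf(n, p, k):
        return math.comb(n, k) * p**k * (1 - p)**(n - k)
    total = 0.0
    for x in range(n + 1):
        px = pmf(n, p0, x)
        for y in range(x, n + 1):  # X <= Y  <=>  X - Y <= 0
            total += px * pmf(n, p1, y)
    return total

def chernoff_bound(n, p0, p1):
    """The exp(-n (p0 - p1)^2 / 2) bound from the question."""
    return math.exp(-0.5 * n * (p0 - p1)**2)

# Illustrative parameters (arbitrary choice)
n, p0, p1 = 50, 0.6, 0.4
print(exact_prob_X_le_Y(n, p0, p1), chernoff_bound(n, p0, p1))
```

For these parameters the exact probability is roughly an order of magnitude below the bound, which is why a variance-aware inequality seems worth pursuing.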

My main question is: Is there a way to prove a tighter, or at least normal-looking, bound whose leading exponent has the form $$\exp\left(-\frac{1}{2}\, n\,\frac{(p_0-p_1)^2}{p_0(1-p_0)+p_1(1-p_1)}\right)?$$
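For reference, here is the Bernstein calculation sketched out (assuming the standard form $\mathbb P(\sum_i Z_i \ge t)\le \exp\bigl(-t^2/\bigl(2(\sum_i \mathbb E Z_i^2 + Mt/3)\bigr)\bigr)$ for independent zero-mean $Z_i$ with $|Z_i|\le M$), which shows where the extra term in the exponent comes from:

```latex
% Take Z_i = (Y_i - X_i) + (p_0 - p_1): zero mean, |Z_i| <= M := 1 + (p_0 - p_1),
% and Var(Z_i) = p_0(1-p_0) + p_1(1-p_1) =: \sigma^2. Then
\begin{align*}
P(X - Y \le 0)
  &= P\Bigl(\textstyle\sum_{i=1}^n Z_i \ge n(p_0 - p_1)\Bigr) \\
  &\le \exp\left(-\frac{n^2 (p_0 - p_1)^2}
       {2\bigl(n\sigma^2 + M\, n(p_0 - p_1)/3\bigr)}\right)
   = \exp\left(-\frac{1}{2}\, n\, \frac{(p_0 - p_1)^2}
       {\sigma^2 + M(p_0 - p_1)/3}\right).
\end{align*}
```

So the desired exponent appears up to the correction term $M(p_0-p_1)/3$ in the denominator (presumably the "messy" part), which is small when $p_0-p_1$ is small but prevents the bound from always beating the Chernoff one.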

Thanks a lot!


1 Answer


I believe you applied the CLT and then used the Chernoff bound. That is unnecessary: the difference of independent Gaussian random variables is itself Gaussian, so you can calculate the probability directly. Note, though, that this is only accurate when $n$ is large.

Solution with the CLT: $$X \stackrel{\cdot}{\sim} N\bigl(np_0,\, np_0(1-p_0)\bigr), \qquad Y \stackrel{\cdot}{\sim} N\bigl(np_1,\, np_1(1-p_1)\bigr),$$ and hence $$X-Y \stackrel{\cdot}{\sim} N\bigl(n(p_0-p_1),\, n[p_0(1-p_0)+p_1(1-p_1)]\bigr).$$ Then you calculate $\mathbb P(X-Y < 0)$ from this normal distribution.
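A sketch of this CLT computation using only the standard library (the parameter values are arbitrary):

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def clt_approx(n, p0, p1):
    """Normal approximation to P(X - Y < 0), using
    X - Y approx ~ N(n(p0 - p1), n[p0(1-p0) + p1(1-p1)])."""
    mean = n * (p0 - p1)
    var = n * (p0 * (1 - p0) + p1 * (1 - p1))
    return normal_cdf((0.0 - mean) / math.sqrt(var))

# Illustrative parameters (arbitrary choice)
print(clt_approx(50, 0.6, 0.4))
```

Note that the exponent of the corresponding Gaussian tail is exactly the one asked about in the question, which is why the CLT heuristic suggests that form; but the CLT gives an approximation, not a bound.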

Solution with the Markov inequality (Chernoff): If you were to use the Chernoff bound on the binomials directly, you would apply Markov's inequality to $e^{t(Y-X)}$ and use independence to get, for all $t>0$, $$\mathbb P(X-Y\le0) = \mathbb P(Y-X\ge0) \le \mathbb E\bigl(e^{t(Y-X)}\bigr) = \mathbb E(e^{tY})\, \mathbb E(e^{-tX}) = \bigl[(1-p_1+p_1e^{t})(1-p_0+p_0e^{-t})\bigr]^{n}.$$ You can then differentiate the right-hand side w.r.t. $t$ and find its minimum.
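A numerical sketch of that minimization (a simple grid search for illustration), together with the closed-form minimum one gets by setting the derivative of the per-trial factor to zero, namely $\bigl[\sqrt{p_0 p_1} + \sqrt{(1-p_0)(1-p_1)}\bigr]^{2n}$:

```python
import math

def chernoff_rhs(n, p0, p1, t):
    """RHS of the MGF bound at a given t > 0."""
    return ((1 - p1 + p1 * math.exp(t)) * (1 - p0 + p0 * math.exp(-t)))**n

def optimized_chernoff(n, p0, p1, grid=10000, t_max=10.0):
    """Minimize the bound over a grid of t values (illustrative sketch)."""
    return min(chernoff_rhs(n, p0, p1, k * t_max / grid)
               for k in range(1, grid + 1))

def closed_form_min(n, p0, p1):
    """Minimum at t* = (1/2) log((1-p1)p0 / (p1(1-p0))), which gives
    [sqrt(p0*p1) + sqrt((1-p0)(1-p1))]^(2n)."""
    return (math.sqrt(p0 * p1) + math.sqrt((1 - p0) * (1 - p1)))**(2 * n)

# Illustrative parameters (arbitrary choice)
n, p0, p1 = 50, 0.6, 0.4
print(optimized_chernoff(n, p0, p1), closed_form_min(n, p0, p1))
```

For these parameters the optimized bound is noticeably tighter than $\exp(-n(p_0-p_1)^2/2)$ from the question, at the cost of a less transparent exponent.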