Chernoff Type Bounds for Uniformly Bounded Conditional Random Variables


I am considering a set of Bernoulli RVs $X_1,X_2,\ldots,X_n$ for which we have a uniform conditional bound $\mathbb{P}[X_i=1\mid X_1,\ldots,X_{i-1}] \leq p$ for a fixed $p$. Can we show that the sum of these random variables is stochastically no larger than the sum of i.i.d. Bernoulli random variables $Y_1,\ldots,Y_n$ with parameter $p$?

In other words, for any $C$ can we show $\mathbb{P}[\sum X_i > C] \leq \mathbb{P}[\sum Y_i > C]$?



Yes we can. Define $p_k(X_1,\dots,X_{k-1}) := \mathbb{P}(X_k=1\mid X_1,\dots,X_{k-1}) \leq p$; here $p_1$ is just a constant. Notice that the joint distribution of $(X_1,\dots,X_n)$ is uniquely determined by the sequence of functions $\{p_k\}_{k=1}^n$ (this is clearly true for $n=1$; for larger $n$, use induction).

Let $\{U_k\}_{k=1}^\infty$ be a sequence of i.i.d. uniform$[0,1]$ random variables. Let $\overline{X}_k = \mathbb{I}_{U_k < p_k(\overline{X}_1,\dots,\overline{X}_{k-1})}$. Then for any $n$, $(\overline{X}_1,\dots,\overline{X}_n) \overset{(d)}{=} (X_1,\dots,X_n)$.

Let $Y_k = \mathbb{I}_{U_k < p}$. Then $\{Y_k\}$ are i.i.d. Bernoulli random variables with parameter $p$ and $Y_k \geq \overline{X}_k$ almost surely. So for any $C$, $\sum_{k=1}^n \overline{X}_k > C$ implies $\sum_{k=1}^n Y_k > C$. By monotonicity of probability measures,

$$\mathbb{P}\left(\sum_{k=1}^n X_k > C\right) = \mathbb{P}\left(\sum_{k=1}^n \overline{X}_k > C\right) \leq \mathbb{P}\left(\sum_{k=1}^n Y_k > C\right)$$
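The coupling can be checked numerically. Below is a minimal sketch in which the conditional probability $p_k = p/(1 + \text{number of previous successes})$ is a made-up example of a model satisfying $p_k \leq p$; because $\overline{X}_k$ and $Y_k$ share the same uniform $U_k$, the dominance $\overline{X}_k \leq Y_k$ holds on every sample path, not just in distribution:

```python
import random

def coupled_sample(n, p, n_trials=10000, seed=0):
    """Sample (sum of X-bar, sum of Y) pairs under the shared-uniform coupling.

    The conditional success probability used here is a hypothetical example:
    p_k = p / (1 + number of previous successes), which always satisfies p_k <= p.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        x_sum = 0  # running sum of the dependent variables X-bar
        y_sum = 0  # running sum of the i.i.d. Bernoulli(p) variables Y
        for _ in range(n):
            u = rng.random()           # one shared uniform per step
            p_k = p / (1 + x_sum)      # example conditional bound, p_k <= p
            if u < p_k:
                x_sum += 1
            if u < p:
                y_sum += 1
        results.append((x_sum, y_sum))
    return results
```

Since `u < p_k` implies `u < p`, every trial satisfies `x_sum <= y_sum`, so the empirical tail frequencies are ordered for every threshold $C$.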


I also encountered this problem. Here is the argument I found.

Let $\Omega$ be the set of all binary sequences of length $n$ with at least $C$ ones. For $x \in \Omega$, let $Pr_{1}(x)$ denote the probability that $x$ occurs as the outcome of the trials $X_1, X_2, \ldots, X_n$. Now replace the last trial $X_n$ with an independent Bernoulli trial $Y_n$ with success probability $p$, and denote the resulting probability of a sequence $x$ by $Pr_{2}(x)$.

For any prefix $x$ of length $n-1$, the hypothesis gives $\mathbb{P}[X_n = 1 \mid x] \leq p$, so $Pr_2(x1) = p \cdot Pr_1(x) \geq Pr_1(x1)$; hence any sequence in $\Omega$ whose probability decreases as a result of the change must end in a $0$. If $x0 \in \Omega$, then $x1 \in \Omega$ as well, and
$$ Pr_2( x1 ) + Pr_2( x0 ) = Pr_2(x) = Pr_1(x) = Pr_1(x1) + Pr_1(x0), $$
so the combined mass of each such pair is preserved. Therefore $Pr_2( \Omega ) \ge Pr_1( \Omega )$.

Continue this argument inductively, replacing $X_{n-1}, X_{n-2}, \ldots, X_1$ with independent Bernoulli variables $Y_{n-1}, \ldots, Y_1$, to obtain probability distributions $Pr_i$ such that $Pr_{i+1}( \Omega ) \ge Pr_i(\Omega)$. Finally,

\begin{align*} Pr \left( \sum_{i = 1}^n Y_i \ge C \right) &\ge Pr \left( \sum_{i = 1}^n X_i \ge C \right). \end{align*}
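For small $n$ this inequality can also be verified exactly by enumerating all $2^n$ binary sequences. The dependent model below (conditional probability $p/(1 + \#\text{previous ones})$) is a hypothetical example satisfying the uniform bound, not anything from the original argument:

```python
from itertools import product

def tail_prob(n, C, cond_p):
    """Exact P(number of ones >= C) for length-n binary trials.

    cond_p(prefix) returns P(next bit = 1 | prefix); for the i.i.d. case,
    pass cond_p = lambda prefix: p.
    """
    total = 0.0
    for bits in product((0, 1), repeat=n):
        prob = 1.0
        prefix = []
        for b in bits:
            q = cond_p(tuple(prefix))
            prob *= q if b == 1 else 1 - q
            prefix.append(b)
        if sum(bits) >= C:
            total += prob
    return total

p = 0.3
dep = lambda prefix: p / (1 + sum(prefix))  # example model with cond. prob. <= p
iid = lambda prefix: p                      # i.i.d. Bernoulli(p) comparison

# dominance holds for every threshold C
for C in range(6):
    assert tail_prob(5, C, dep) <= tail_prob(5, C, iid) + 1e-12
```

With $C = 0$ both tails equal $1$, and the gap grows for larger thresholds, matching the stochastic-dominance conclusion above.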