Chernoff Type Bounds for a Stopped Sum of Independent Random Variables


Let $Y_1, \ldots, Y_n$ and $X_1, \ldots, X_n$ be i.i.d. $p$-Bernoulli random variables and let $T \in \{0, \ldots, n\}$ be a stopping time for the process. From Wald's equation, we know $$ E\left[\sum_{i=1}^T Y_i \right] = E\left[\sum_{i=1}^T X_i \right] = p \times E[T]. $$

On the other hand, if $T$ were a fixed index rather than a random variable, a standard multiplicative Chernoff bound (applied to each sum separately and combined via a union bound, which only affects the constants in the exponent) would give that for $\mu = pT$ and any $0 < \delta < 1$, with probability $\geq 1-\exp(-\delta^2 \mu/3)$ we have $$ \sum_{i=1}^T Y_i \in (1\pm \delta) \sum_{i=1}^T X_i. $$

Now, going back to $T$ being a stopping time and not known a priori, can we get a similar concentration bound? That is, letting $\mu$ be a given lower bound on the expected value of $\sum_{i=1}^T X_i$, does the statement above still hold?
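For concreteness, here is a small simulation of the setup. The particular stopping rule (stop at the first index where the $X$-partial sum reaches a hypothetical threshold $k$, capped at $n$) is my own illustrative choice, not part of the question; it depends only on the past, so it is a valid stopping time, and the run checks Wald's equation empirically for both stopped sums.

```python
import random

def stopped_sums(n, p, k, rng):
    """One trial: X_i, Y_i i.i.d. Bernoulli(p); as an illustrative
    (hypothetical) stopping time, T = first index i at which
    X_1 + ... + X_i reaches k, capped at n."""
    X = [1 if rng.random() < p else 0 for _ in range(n)]
    Y = [1 if rng.random() < p else 0 for _ in range(n)]
    T, s = n, 0
    for i, x in enumerate(X, start=1):
        s += x
        if s >= k:
            T = i
            break
    return sum(X[:T]), sum(Y[:T]), T

# Empirical check of Wald's equation: both stopped sums should
# average approximately p * E[T].
rng = random.Random(0)
trials = 20000
sx = sy = st = 0
for _ in range(trials):
    a, b, t = stopped_sums(n=200, p=0.3, k=20, rng=rng)
    sx += a
    sy += b
    st += t
print(sx / trials, sy / trials, 0.3 * st / trials)  # all roughly equal
```

Note that $\sum_{i \le T} X_i$ is essentially constant here (it equals $k$ whenever the threshold is hit), while $\sum_{i \le T} Y_i$ fluctuates around the same mean; the question is how tightly the latter concentrates around the former.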

Note: One could try to apply a Chernoff bound over all $n$ possible values of $T$ and set the failure probability small enough that, with high probability, the partial sums of the two sequences stay close at every prefix, no matter the value of $T$. Unfortunately, to be able to do this, the error probability would have to depend on $n$, whereas the original bound is dimension-free. Is there any better way to do this?
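To spell out where the dimension dependence enters in this union-bound route (a sketch using the one-sided failure probability $\exp(-\delta^2\mu/3)$ from the question; the two-sided version only changes constants): requiring all $n$ prefix bounds to hold simultaneously with total failure probability $\varepsilon$ forces each individual application to run at confidence level $\varepsilon/n$, i.e.

```latex
n \cdot \exp\!\left(-\frac{\delta^{2}\mu}{3}\right) \le \varepsilon
\quad\Longleftrightarrow\quad
\mu \ \ge\ \frac{3}{\delta^{2}} \ln\frac{n}{\varepsilon},
```

so the required lower bound $\mu$ picks up a $\ln n$ term that the fixed-$T$ statement does not have.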