Bound on the expected time of first success in a series of Bernoulli RVs


Given an infinite sequence of Bernoulli RVs $X_1,X_2,...$ (which may be differently distributed and mutually dependent), we are given that for every $n>0$, $$\mathbb{E}\left[\sum_{t=1}^{n}(1-X_t)\right] \leq a\log n~,$$ with $a>0$ some deterministic parameter. Intuitively, this means that the expected number of failures up to time $n$ grows only logarithmically, so it is negligible compared to $n$ when $n$ is large.

My question is whether this implies a bound on $\mathbb{E}\tau$ as a function of $a$ (I will be glad as long as $\mathbb{E}\tau$ is guaranteed to not be infinite), where $$\tau \triangleq \min \{t>0:X_t=1\} $$ is the time of the first success.

Thank you!

Edit: Sorry for the confusion in my original post; I changed it to clarify that the bound on $\mathbb{E}\tau$ should not depend on $n$ and that $\tau$ stands for the first success at any time.


Let's say $X_t$ has marginal distribution $\mathrm{Bern}(p_t)$ (only the marginals matter in what follows, so dependence is not an issue). Your given inequality equivalently states

$$E\left[\sum_{t=1}^n (1-X_t)\right]=n-(p_1+...+p_n)\leq a\log n.$$

Also, since $\tau:=\min\{t: 1\leq t\leq n,\ X_t=1\}$ (with the convention $\tau=n$ if there is no success in the first $n$ steps) takes values in $\{1,...,n\}$, the tail-sum formula gives:

$$E[\tau]=\sum_{T=1}^n P(\tau \geq T).$$
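Since the tail-sum formula is the backbone of the argument, here is a minimal numeric sanity check in Python, assuming *independent* $X_t$ with made-up success probabilities (the list `p` below is purely illustrative):

```python
import math

# Hypothetical independent example: success probabilities p_1..p_n.
p = [0.2, 0.5, 0.3, 0.9]
n = len(p)

# Survival probabilities P(tau >= T) = (1-p_1)...(1-p_{T-1}), for tau
# truncated at n (tau = n when there is no success in the first n steps).
surv = [math.prod(1 - pt for pt in p[:T - 1]) for T in range(1, n + 1)]

# Direct expectation: P(tau = T) = surv[T] * p_T for T < n, and
# P(tau = n) = surv[n] (either a success at time n, or truncation).
pmf = [surv[T] * p[T] for T in range(n - 1)] + [surv[n - 1]]
E_direct = sum((T + 1) * pmf[T] for T in range(n))

# Tail-sum formula: E[tau] = sum_{T=1}^n P(tau >= T).
E_tail = sum(surv)

print(E_direct, E_tail)  # the two values agree (up to float rounding)
```

The truncation convention only matters on the event of no success at all, which is exactly why the formula stops at $T=n$.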

Note that for $T\geq 2$, $$P(\tau\geq T)=P(X_1=0,X_2=0,...,X_{T-1}=0)\leq P(X_{T-1}=0)=1-p_{T-1},$$

and note this inequality a) extends to $T=1$ by defining $p_0:=0$, and b) cannot be sharpened in general, since it holds with equality when the $X_i$ are identically distributed and perfectly correlated, so that they all take the same value. Thus, we have

$$E[\tau]=\sum_{T=1}^n P(\tau\geq T)\leq \sum_{T=1}^n (1-p_{T-1})=n-(p_1+...+p_n)+p_n\leq a\log n+p_n\leq 1+a\log n.$$
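The whole chain can be checked numerically. Below is a sketch assuming independent $X_t$ with hypothetical failure probabilities $q_t=\min(1, 1.5/t)$ and the made-up choices $a=2$, $n=100$, picked only so that the hypothesis $n-(p_1+...+p_n)\leq a\log n$ holds at this $n$:

```python
import math

# Hypothetical example: a = 2, n = 100, independent X_t with failure
# probabilities q_t = min(1, 1.5/t), i.e. success probabilities p_t = 1 - q_t.
a, n = 2.0, 100
q = [min(1.0, 1.5 / t) for t in range(1, n + 1)]
p = [1 - qt for qt in q]

# Hypothesis at this n: expected number of failures <= a log n.
failures = sum(q)  # = n - (p_1 + ... + p_n)
assert failures <= a * math.log(n)

# E[tau] for tau truncated at n, via the tail-sum formula, where
# P(tau >= T) = q_1 * ... * q_{T-1} by independence.
surv, E_tau = 1.0, 0.0
middle = 0.0   # accumulates sum_{T=1}^n (1 - p_{T-1}), with p_0 = 0
p_prev = 0.0
for T in range(1, n + 1):
    E_tau += surv
    middle += 1 - p_prev
    surv *= q[T - 1]
    p_prev = p[T - 1]

# The chain of inequalities from the answer:
assert E_tau <= middle <= 1 + failures <= 1 + a * math.log(n)
```

Note that the resulting bound $1+a\log n$ still depends on $n$, so by itself it does not settle finiteness of $\mathbb{E}\tau$ for the untruncated first-success time.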