Sequence of Bernoulli trials: the probability of success at the $i$th trial varies (or not)


Consider a sequence of Bernoulli trials in which the probability of success varies from trial to trial as follows:

  • At the first trial: the probability of success is $p_1 = 1 / n_1$, with $n_1 = 1$.
  • At the second trial: the probability of success is $p_2 = 1 / n_2$, with $ n_2 = \begin{cases} n_1 + 1 & \text{if the first trial was a success,} \\ n_1 & \text{if the first trial was a failure.} \\ \end{cases}$
  • $\ldots$
  • At the $i$th trial: the probability of success is $p_i = 1 / n_i$, with $ n_i = \begin{cases} n_{i-1} + 1 & \text{if the $(i-1)$th trial was a success,} \\ n_{i-1} & \text{if the $(i-1)$th trial was a failure.} \\ \end{cases}$
  • Etc.
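A quick way to get a feel for the process is to simulate it. Here is a minimal Python sketch (the function name `simulate_n` is my own); it just runs trials $1, \ldots, i-1$ and returns $n_i$:

```python
import random

def simulate_n(i, rng=random):
    """Simulate trials 1..i-1 of the process and return n_i.

    n_1 = 1, and a trial with current counter n succeeds with
    probability 1/n, incrementing the counter.  Note that since
    p_1 = 1/1 = 1, the first trial always succeeds, so n_2 = 2.
    """
    n = 1
    for _ in range(i - 1):
        if rng.random() < 1.0 / n:
            n += 1
    return n

# Monte Carlo estimate of E[n_5]; n_5 always lies in {2, ..., 5}.
samples = [simulate_n(5) for _ in range(10_000)]
estimate = sum(samples) / len(samples)
```
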

I am trying to work out the probability mass function and the expected value of $n_i$ for a given $i$, but I would welcome any insightful information about this process.

What I have so far:

  • Binary tree diagram showing a 5-trial progression of the process
  • The observation that $n_i - 1$ equals the number of successes among the first $i - 1$ trials, so that:
    • There are ${i - 1 \choose k - 1}$ paths through the tree leading to $n_i = k$, with $k \in \{1, \ldots, i\}$.
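One way to avoid tracking every product of probabilities by hand is to propagate the pmf of $n_i$ one trial at a time: conditioning on $n_j = k$, the next value is $k + 1$ with probability $1/k$ and $k$ otherwise, so $P(n_{j+1} = k) = P(n_j = k)\,(1 - 1/k) + P(n_j = k-1)\,\frac{1}{k-1}$. A hedged sketch in exact rational arithmetic (the helper name `pmf_n` is mine):

```python
from fractions import Fraction

def pmf_n(i):
    """Exact pmf of n_i as a dict {k: Fraction}, computed by
    propagating the one-step transition
        k -> k+1 with probability 1/k,  k -> k otherwise."""
    pmf = {1: Fraction(1)}           # n_1 = 1 with probability 1
    for _ in range(i - 1):
        nxt = {}
        for k, p in pmf.items():
            nxt[k] = nxt.get(k, Fraction(0)) + p * (1 - Fraction(1, k))
            nxt[k + 1] = nxt.get(k + 1, Fraction(0)) + p * Fraction(1, k)
        pmf = {k: p for k, p in nxt.items() if p}
    return pmf
```

For example, `pmf_n(3)` gives $P(n_3 = 2) = P(n_3 = 3) = 1/2$, consistent with the tree diagram.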

Any hint on how to find a compact expression (if one exists)? My main issue is generalizing the products of probabilities elegantly, without keeping track of every single factor "manually" in the expressions.


The generating functions $$g_k(s)=E(s^{n_k})$$ satisfy the recursion $g_1(s)=s$ and $$g_{k+1}(s)=g_k(s)-(1-s)\int_0^s\frac1tg_k(t)\,dt.$$ To characterize the sequence $(g_k)$, a common trick is to consider the series $$G_x(s)=\sum_{k=1}^\infty g_k(s)x^k;$$ the recursion on the sequence $(g_k)$ then yields $$G_x(s)=xs+xG_x(s)-x(1-s)\int_0^s\frac1tG_x(t)\,dt.$$ From there, one can show that each function $G_x$ solves a first-order linear differential equation, hence some "explicit" formulas for $G_x$ are available. However, extracting precise information from such formulas is not always simple...
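The recursion follows from conditioning on $n_k$: a success turns $s^{n_k}$ into $s^{n_k+1}$, so $E(s^{n_{k+1}}\mid n_k) = s^{n_k}\bigl(1-\tfrac1{n_k}\bigr) + s^{n_k+1}\tfrac1{n_k} = s^{n_k} - (1-s)\tfrac{s^{n_k}}{n_k}$, and $E\bigl(\tfrac{s^{n_k}}{n_k}\bigr) = \int_0^s \tfrac1t\,g_k(t)\,dt$ since $\int_0^s t^{n-1}\,dt = s^n/n$. Because each $g_k$ is a polynomial, the integral recursion can be checked directly on coefficients; a sketch (the dict representation and the name `next_g` are my own):

```python
from fractions import Fraction

def next_g(coeffs):
    """Apply g_{k+1}(s) = g_k(s) - (1-s) * int_0^s g_k(t)/t dt
    to a polynomial g_k given as {power: coefficient}."""
    # For g_k(t) = sum_j a_j t^j, the integral term is sum_j (a_j / j) s^j.
    integral = {j: a / j for j, a in coeffs.items()}
    out = dict(coeffs)
    for j, a in integral.items():
        out[j] = out.get(j, Fraction(0)) - a          # -1 * integral
        out[j + 1] = out.get(j + 1, Fraction(0)) + a  # +s * integral
    return {j: a for j, a in out.items() if a}

g1 = {1: Fraction(1)}   # g_1(s) = s
g2 = next_g(g1)         # expect {2: 1}: n_2 = 2 almost surely, since p_1 = 1
g3 = next_g(g2)         # expect {2: 1/2, 3: 1/2}: n_3 uniform on {2, 3}
```

The first few iterates reproduce the distributions read off the tree diagram, which supports the recursion.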