Lower bounding the number of children in branching process


Suppose we have a recursive branching process in which the number of children of each individual is Binomial$(n, p)$ for parameters $n$ and (probability) $p$. Each child produces its own Binomial$(n, p)$ number of children (once), and those children repeat the same process. All samples are drawn independently.

Let $Y_i$ be the number of generation-$i$ children. I believe the expectation of $Y_i$ is given by $(n p)^i$. While the expectation is easy to compute, I would like to have a lower bound on the probability of $Y_i$ being less than $(np)^i / 2$.

Chernoff bounds don't seem to be applicable here since there are dependencies between children on the same generation. Is there any other concentration result that could be used instead?




I hope this answer helps.

Firstly, recall the definition of the probability generating function (PGF), $f_X(s)$, of a random variable $X$:

$$f_X(s)=\sum_{i\in\mathbb{N}_0}P(X=i)s^i$$

So the PGF of $Y_i$, which we can call $f_i(s)$, can be evaluated at zero to get $P(Y_i=0)$.

For a binomial distributed random variable as per your parameters, the PGF of the offspring produced by a single individual takes the form $$f(s)=(ps+(1-p))^n $$
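As a quick sanity check of this closed form (the values $n=3$, $p=0.4$ below are illustrative, not from the question), one can verify numerically that $(ps+(1-p))^n$ agrees with the defining sum of the PGF over the binomial pmf:

```python
from math import comb

def pgf_from_pmf(n: int, p: float, s: float) -> float:
    """PGF computed directly from its definition: sum_k P(X=k) * s**k."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) * s**k for k in range(n + 1))

def pgf_closed_form(n: int, p: float, s: float) -> float:
    """Closed-form binomial PGF: (p*s + (1-p))**n."""
    return (p * s + (1 - p)) ** n

# The two agree, and evaluating at s=0 recovers P(X=0) = (1-p)**n.
assert abs(pgf_from_pmf(3, 0.4, 0.7) - pgf_closed_form(3, 0.4, 0.7)) < 1e-12
```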

A standard result from Branching Processes is that $f_i(s)$ takes the form of the $i$ repeated compositions of the PGF of the offspring distribution:

$$f_i(s)=f(f_{i-1}(s))=\underbrace{f(f(\cdots f(s)\cdots))}_{i\text{ times}}$$

so long as the branching process is initiated by a single individual.

We can use these facts to calculate $P(Y_i=0)$, and use this as a lower bound. i.e.

$$P(Y_i<(np)^i/2)\geq P(Y_i=0)=f_i(0)$$

And $f_i(0)$ is calculated by recursion using:

$$f_i(0)=(pf_{i-1}(0)+(1-p))^n$$

with $f_1(0)$ being the bound that Did gave in his comment, i.e. $f_1(0)=(1-p)^n$. This bound should become quite good for large $i$: the branching process either grows very large or approaches extinction, so $P(Y_i<(np)^i/2)$ should approach the extinction probability, to which the recursion above converges.
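The recursion is straightforward to evaluate numerically. A minimal sketch (the parameters $n=3$, $p=0.4$ are illustrative only):

```python
def extinction_lower_bound(n: int, p: float, i: int) -> float:
    """Return f_i(0) = P(Y_i = 0), a lower bound on P(Y_i < (n*p)**i / 2).

    Uses the recursion f_i(0) = (p * f_{i-1}(0) + (1 - p))**n,
    starting from f_1(0) = (1 - p)**n.
    """
    f = (1.0 - p) ** n  # f_1(0), the bound from Did's comment
    for _ in range(i - 1):
        f = (p * f + (1.0 - p)) ** n
    return f

# The bounds increase monotonically in i and converge to the extinction
# probability, the smallest fixed point of f on [0, 1].
bounds = [extinction_lower_bound(3, 0.4, i) for i in range(1, 8)]
```

For these illustrative parameters the sequence starts at $f_1(0)=0.6^3=0.216$ and climbs toward the extinction probability, the smallest root of $q=(0.4q+0.6)^3$ in $[0,1]$.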

Furthermore, note that $f_i(0)$ increases monotonically in $i$ (can you see why?), so you can use the above method to calculate $f_j(0)$ for any $j\leq i$ and obtain another (weaker) bound, if you don't want to iterate all the way to $f_i(0)$. In the case $j=1$, this reduces to the bound Did suggested in his comment.

EDIT: In the original answer, I suggested that the full PGF of $Y_i$ was available in closed form and could be used to calculate a lower bound. That is in fact not the case, so I have changed the answer accordingly.


I state an asymptotic result for $i\to \infty$, but cannot give a result for finite $i$.

Note that for $m= np$ the sequence $(m^{-i}Y_i)_{i\geq 1}$ is a nonnegative martingale. Nonnegative martingales converge almost surely, say to the random variable $W$. In particular, as $i\to \infty$: $$ P\left(m^{-i} Y_i < \frac{1}{2}\right) \to P\left(W < \frac{1}{2}\right) .$$

This is classical theory on branching processes; see Section I.8 in Harris' book on branching processes or page 33 of http://www.math.ubc.ca/~db5d/SummerSchool09/lectures-dd/lecture2-3.pdf which contains some ideas and references for the convergence result.

Basically, in the supercritical case (i.e. $m>1$) one has $P(W < 1/2) \in (0,1)$, and in the other cases $P(W<1/2) = 1$. The last fact follows from the almost sure extinction of the process $(Y_i)_{i\geq 1}$ when $m\leq 1$.
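This limit can be illustrated by simulation. A Monte Carlo sketch (the parameters $n=3$, $p=0.4$, the trial count, and the seed are illustrative choices, not from the answer): simulate $Y_i$ for a Binomial$(n,p)$ offspring distribution and estimate $P(m^{-i}Y_i < 1/2)$, which for growing $i$ should settle near the supercritical limit $P(W<1/2)$.

```python
import random

def simulate_generation(n: int, p: float, generations: int, rng: random.Random) -> int:
    """Simulate Y_generations starting from a single individual."""
    y = 1
    for _ in range(generations):
        # Each of the y individuals has Binomial(n, p) children.
        y = sum(1 for _ in range(y * n) if rng.random() < p)
    return y

def estimate_prob(n: int, p: float, generations: int, trials: int = 2000, seed: int = 0) -> float:
    """Monte Carlo estimate of P(m**(-i) * Y_i < 1/2) with m = n*p."""
    rng = random.Random(seed)
    m = n * p
    hits = sum(
        simulate_generation(n, p, generations, rng) < 0.5 * m ** generations
        for _ in range(trials)
    )
    return hits / trials

est = estimate_prob(3, 0.4, generations=8)
```

For these supercritical parameters ($m = 1.2$) the estimate lands strictly between 0 and 1, consistent with $P(W<1/2)\in(0,1)$; most of the mass comes from the extinction event, whose probability the $f_i(0)$ recursion in the other answer computes exactly.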