How to prove this asymptotic limit for branching processes in the critical case.


Consider a Galton–Watson process with population sizes $1 = Z_0, Z_1, Z_2, \ldots$, where the offspring distribution is some random variable $X$. If $\mu = \mathbb{E}(X) = 1$, then the process dies out almost surely in the long run. This link claims in Proposition 5(ii) that in this case $$ \mathbb{P}(Z_t > 0) \sim \frac{2}{t \operatorname{Var}(X)} $$ as $t \to \infty$. They do not prove it, however, so I am wondering how to prove the above statement.
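As a quick sanity check of the claim (my own addition, not part of the question), here is a minimal Monte Carlo sketch assuming the concrete critical offspring law Poisson(1), so that $\mu = 1$ and $\operatorname{Var}(X) = 1$ and the prediction is $\mathbb{P}(Z_t > 0) \approx 2/t$:

```python
import math
import random

def poisson1(rng):
    """Sample a Poisson(1) variate via Knuth's product-of-uniforms method."""
    limit = math.exp(-1.0)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def survival_prob(t, trials=20000, seed=0):
    """Monte Carlo estimate of P(Z_t > 0) for the critical GW process
    with Poisson(1) offspring, started from Z_0 = 1."""
    rng = random.Random(seed)
    alive = 0
    for _ in range(trials):
        z = 1
        for _ in range(t):
            # next generation = sum of z independent offspring counts
            z = sum(poisson1(rng) for _ in range(z))
            if z == 0:
                break
        if z > 0:
            alive += 1
    return alive / trials

print(survival_prob(50), 2 / 50)  # same order of magnitude
```

The agreement is only rough at moderate $t$, since the asymptotic has lower-order correction terms.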

If $G$ is the generating function of $X$, then I know that $r_t = \mathbb{P}(Z_t>0) = 1- G^t(0)$, where $G^t$ is the $t$-th iterate of $G$. Using a Taylor expansion around $1$ we get \begin{align} r_{t+1} = 1-G(G^t(0)) &\approx 1- \big(G(1) + (G^t(0)-1)G'(1) + (G^t(0)-1)^2 G''(1)/2\big) \\ &= (1-G^t(0)) - (1-G^t(0))^2 \operatorname{Var}(X)/2\\ & = r_t - r_t^2 \sigma^2/2, \end{align} using that $G''(1) = \mathbb{E}[X(X-1)] = \operatorname{Var}(X) = \sigma^2$ when $\mu = 1$.
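The recursion already suggests $r_t \approx 2/(\sigma^2 t)$, and this can be checked numerically before proving it. A small sketch, assuming Poisson(1) offspring (so $G(s) = e^{s-1}$ and $\sigma^2 = 1$), iterates $G$ exactly and watches $t\,r_t$ approach $2/\sigma^2 = 2$:

```python
import math

def r(t):
    """r_t = 1 - G^t(0) for G(s) = exp(s - 1), i.e. Poisson(1) offspring."""
    s = 0.0
    for _ in range(t):
        s = math.exp(s - 1.0)  # iterate the generating function
    return 1.0 - s

for t in (10, 100, 1000, 10000):
    print(t, t * r(t))  # creeps up toward 2/sigma^2 = 2
```

The convergence is slow (the correction is of order $\log t / t$), which is consistent with the asymptotic being stated only as $t \to \infty$.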

I am not sure how to continue from here....

Thanks in advance for any help!


Best answer:

We only need to show the lemma $$ \lim_{n \to \infty} \frac{1}{n} \left[\frac{1}{1 - G^{n}(s)} - \frac{1}{1 - s} \right] = \frac{\sigma^2}{2}. $$ Since $G(1) = 1$, $G'(1) = \mu = 1$, and $G''(1) = \mathbb{E}[X(X-1)] = \operatorname{Var}(X) = \sigma^2$ (using $\mu = 1$), one can use a Taylor expansion of $G$ around $1$ to get
$$ G(s) = s + \left(\frac{\sigma^2}{2} + \epsilon(s) \right) (1-s)^2, $$ with $\lim_{s \to 1} \epsilon(s) = 0$. Thus, we obtain $$ \frac{1}{1 - G(s)} - \frac{1}{1 - s} = \frac{\left(\frac{\sigma^2}{2} + \epsilon(s) \right) (1-s)^2}{(1 - G(s))(1 - s)} = \frac{1 - s}{1 - G(s)}\left(\frac{\sigma^2}{2} + \epsilon(s) \right) = \frac{\sigma^2}{2} + \bar{\epsilon}(s), $$ where $\lim_{s \to 1} \bar{\epsilon}(s) = 0$, since $(1-s)/(1 - G(s)) \to 1/G'(1) = 1$ as $s \to 1$. Evaluating this identity at the points $G^{j}(s)$ for $j = 0, \dots, n-1$ and summing, the left-hand sides telescope, so $$ \frac{1}{n} \left[\frac{1}{1 - G^{n}(s)} - \frac{1}{1 - s} \right] = \frac{\sigma^2}{2} + \frac{1}{n} \sum_{j=0}^{n-1} \bar{\epsilon} ( G^j (s)). $$ Since $\lim_{n \to \infty} G^n (s) = 1$ for $s \in [0,1)$ and $\lim_{s \to 1} \bar{\epsilon}(s) = 0$, we have by composition $\lim_{n \to \infty} \bar{\epsilon} ( G^n (s)) = 0$. We can then use Cesàro's lemma, for example, to get $ \lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^{n-1} \bar{\epsilon} ( G^j (s)) = 0$. Taking $s = 0$, we deduce that $$ \lim_{n \to \infty} n\, \mathbb{P}(Z_n > 0) = \lim_{n \to \infty} n\, (1 - G^{n}(0)) = \frac{2}{\sigma^2}. $$
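As an illustration of the lemma (my own addition, not part of the answer), one can verify it numerically for a concrete critical law. Assuming $P(X=0) = P(X=2) = 1/2$, so $G(s) = (1+s^2)/2$, $\mu = 1$, and $\sigma^2 = 1$, the bracketed quantity at $s = 0$ should approach $\sigma^2/2 = 1/2$:

```python
def G(s):
    # generating function of the offspring law P(X=0) = P(X=2) = 1/2
    return 0.5 * (1.0 + s * s)

def lemma_ratio(n):
    """(1/n) * [1/(1 - G^n(0)) - 1/(1 - 0)], which the lemma says
    tends to sigma^2 / 2 = 0.5 for this offspring law."""
    s = 0.0
    for _ in range(n):
        s = G(s)
    return (1.0 / (1.0 - s) - 1.0) / n

for n in (10, 100, 1000, 10000):
    print(n, lemma_ratio(n))  # tends to 0.5
```

For this particular $G$ the one-step increment can even be computed exactly: $\frac{1}{1-G(s)} - \frac{1}{1-s} = \frac{1}{1+s}$, which tends to $1/2$ as $s \to 1$, matching the lemma.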

Another answer:

I think they skip the proof because it is not trivial. See the "Basic Lemma" on page 19 of Athreya & Ney (linked below). As you have correctly noted, $P(Z_n>0) = 1 - f_n(0)$, where $f_n(t)$ is the $n$-th iterate of the generating function (in Athreya & Ney's notation), so the Basic Lemma becomes exactly the statement you want to prove when you take $t = 0$ (see Theorem 1, also on page 19).

https://books.google.com/books?id=CE3uCAAAQBAJ&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

The proof of the Basic Lemma starts at the bottom of page 20. Unfortunately, pages 21–22 are not visible in the Google Books preview. You can check whether these pages are available for free elsewhere (for example on Amazon), but the book costs only \$13, and it is the branching-process Bible, so it may be worth buying.

https://www.amazon.com/Branching-Processes-Dover-Books-Mathematics/dp/0486434745