Branching probabilities part 2


Following up on this question: Branching process probabilities.


$\underline{\textbf{Extinction Probability}}$

We would like to find the following probability:

$\Bbb P(\text{Population becomes extinct} \mid X_0 =1)= \Bbb P(X_n=0 \text{ for some } n \mid X_0=1)$

Firstly, look at $\Bbb E_1(X_n)$, the expected number of individuals at time $n$, given that we started with one individual at time $0$.

$$\Bbb E_1(X_n)= \frac{d}{ds}F_n(s)\Bigr|_{\substack{s\uparrow 1}}$$

$$=\frac{d}{ds} \underbrace{G \circ G \circ \cdots \circ G(s)}_{n \text{ times}}\Bigr|_{\substack{s\uparrow 1}}\tag{1}$$

and by chain rule

$$= (G^{'}(s))^n\Bigr|_{\substack{s\uparrow 1}}\tag{2}$$

How are $(1)$ and $(2)$ derived?

$\Rightarrow \Bbb E_1(X_n)= \mu^n$, where $\mu := G'(1)$ is the mean number of offspring per individual.

How is this derived?

Now let $$f^{*}_{k,0}= \Bbb P(X_n= 0 \text{ for some $n$ } | X_0=k)=$$

by independence

$$\Bigl(\Bbb P(X_n=0 \text{ for some $n$ } | X_0 =1) \Bigr)^k= \Bigl(f^{*}_{1,0}\Bigr)^k$$
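The independence step can be sanity-checked numerically: for any fixed horizon $n$, $\Bbb P(X_n = 0 \mid X_0 = k) = \Bbb P(X_n = 0 \mid X_0 = 1)^k$, because the $k$ family lines evolve independently. A Monte Carlo sketch (the offspring distribution $p_0=0.3$, $p_1=0.5$, $p_2=0.2$ is a made-up example):

```python
import numpy as np

# Made-up offspring pmf for illustration: P(Z=0)=0.3, P(Z=1)=0.5, P(Z=2)=0.2
probs = np.array([0.3, 0.5, 0.2])
rng = np.random.default_rng(0)

def extinct_by(n, k, trials=20_000):
    """Monte Carlo estimate of P(X_n = 0 | X_0 = k)."""
    pop = np.full(trials, k)
    for _ in range(n):
        # each of the pop[i] individuals reproduces independently;
        # rng.multinomial counts how many have 0, 1, or 2 children
        pop = np.array([rng.multinomial(m, probs) @ np.arange(3) for m in pop])
    return float(np.mean(pop == 0))

one = extinct_by(n=8, k=1)     # P(X_8 = 0 | X_0 = 1)
three = extinct_by(n=8, k=3)   # P(X_8 = 0 | X_0 = 3)
print(three, one ** 3)         # agree up to Monte Carlo error
```

Letting $n \to \infty$ turns this fixed-horizon identity into the extinction identity $f^{*}_{k,0} = (f^{*}_{1,0})^k$.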

Now,

$$f^{*}_{1,0}= \Bbb P\Bigl(\bigcup^{\infty}_{m =1} \{X_{m+j}=0 \ \forall j \geq 0\} \Bigm| X_0=1\Bigr)$$

$$=\lim_{m \rightarrow \infty} \Bbb P(X_{m+j} = 0 \ \forall j \geq 0 \mid X_0=1)$$

by continuity of probability along the increasing events, and,

since state $\{0\}$ is an absorbing state

$$=\lim_{m \rightarrow \infty} \Bbb P(X_m=0 \mid X_0=1)$$

$$=\lim_{m \rightarrow \infty} F_m(0)$$

I don't quite get how we got the above?

$$=\lim_{m \rightarrow \infty} G(F_{m-1}(0))$$

So, we have an iterative formula for $x_n := \Bbb P(X_n=0 \mid X_0=1)$.

Starting with $$x_0= \Bbb P(X_0=0 \mid X_0=1) = 0,$$

$$x_1=\Bbb P(X_1=0 | X_0=1) = G(0) = \Bbb P(Z=0)$$

Then, by iterating, we have $x_2=G(x_1)$, $x_3=G(x_2)$, and in general $x_{n+1}=G(x_n)$.
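Numerically, the iteration $x_{n+1} = G(x_n)$ with $x_0 = 0$ converges to the smallest fixed point of $G$ in $[0,1]$, which is the extinction probability. A minimal sketch, assuming for illustration a Poisson$(2)$ offspring law, so $G(s) = e^{2(s-1)}$:

```python
import math

# Made-up offspring law for illustration: Poisson(2), so G(s) = exp(2(s-1))
lam = 2.0

def G(s):
    return math.exp(lam * (s - 1.0))

x = 0.0                      # x_0 = P(X_0 = 0 | X_0 = 1) = 0
for _ in range(100):
    x = G(x)                 # x_n = G(x_{n-1}) = P(X_n = 0 | X_0 = 1)

print(x)                     # ≈ 0.2032, the smallest root of s = G(s) in [0, 1]
```

Since $\mu = 2 > 1$ here, the extinction probability is strictly less than $1$; for $\mu \leq 1$ the same iteration converges to $1$.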

Everything I did not question explicitly I understand how to deduce, at least algebraically. If someone could clarify my questions above, that would be great! Any help is appreciated.

**Best Answer**

$\newcommand{\EE}{\mathbf{E}}$ $\newcommand{\PP}{\mathbb{P}}$ $\newcommand{\NN}{\mathbb{N}}$

I'm following in the footsteps of G. Grimmett and D. Stirzaker, *Probability and Random Processes*. Let $X_0, X_1, X_2, \dots$ be the generation sizes and $G_n(s) = \EE( s^{X_n} )$ the generating function of $X_n$. Since $X_n$ takes values in $\NN_0 = \{0, 1, 2, \dots\}$, we have $G_n(s) = \sum_{k=0}^{\infty} s^k p^n_k$, where $p^n_k$ is the probability that the $n$-th generation has size $k$.

The following fact will be useful:

Let $S_N = \sum_{i=1}^N \eta_i$, where $\{ \eta_i \}_{i \geq 1}$ is a collection of iid random variables with common generating function $F_{\eta}$, and let $N$ be a discrete random variable, independent of the $\eta_i$, with generating function $F_N$. Then $F_{S_N}(s) = F_N( F_{\eta}(s))$.

This is true because: $$F_{S_N}(s) = \EE s^{S_N} = \sum_{n \geq 0} \EE ( s^{S_N} \mid N = n )\, \PP(N=n) = \sum_{n \geq 0} \EE( s^{S_n})\, \PP(N=n)$$

By independence of the summands and their identical distribution we obtain: $$ \sum_{n \geq 0} F_{\eta}(s)^n\, \PP(N = n) = F_N( F_{\eta}(s) )$$ qed
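This compound-sum identity $F_{S_N} = F_N \circ F_{\eta}$ can be checked exactly for small supports (the pmfs of $\eta$ and $N$ below are made-up examples): build the pmf of $S_N$ as a mixture of $n$-fold convolutions and compare generating functions at a sample point.

```python
import numpy as np

# Made-up pmfs for illustration: eta on {0,1}, N on {0,1,2}
eta = np.array([0.6, 0.4])
N = np.array([0.2, 0.5, 0.3])

# Exact pmf of S_N = eta_1 + ... + eta_N: a mixture of n-fold convolutions
pmf = np.zeros(1)
conv = np.array([1.0])                 # 0-fold convolution: point mass at 0
for n, pn in enumerate(N):
    m = max(len(pmf), len(conv))
    pmf = np.pad(pmf, (0, m - len(pmf))) + pn * np.pad(conv, (0, m - len(conv)))
    conv = np.convolve(conv, eta)      # next n-fold convolution

def pgf(p, s):
    """Evaluate the generating function of a pmf p at s."""
    return sum(pk * s ** k for k, pk in enumerate(p))

s = 0.7
lhs = pgf(pmf, s)              # F_{S_N}(s), from the exact pmf
rhs = pgf(N, pgf(eta, s))      # F_N(F_eta(s))
print(lhs, rhs)                # both ≈ 0.87232
```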

But for $G$ the generating function of the offspring distribution of the branching process, we have $G_{m+n}(s) = G_m(G_n(s)) = G_n(G_m(s))$, and as such $G_n(s)$ is the $n$-fold iterate of $G$. To see this, note that we can decompose the generation size at step $n+m$ in the following way: $$X_{n+m} = Z_1 + Z_2 + \dots + Z_{X_n}$$

where $Z_i$ is the number of generation-$(n+m)$ descendants of the $i$-th member of the $n$-th generation. Note that each $Z_i$ counts a branching process of its own, run for $m$ generations from a single ancestor, so it has generating function $G_m$. Also, the progenies of distinct members of the $n$-th generation are independent of each other, so the preceding sum is a sum of iid random variables whose number is governed by $X_n$. Therefore, by the fact above: $$G_{n+m}(s) = G_n(G_m(s))$$

Picking $n = 1$ and iterating, we obtain the desired equality.
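One can verify that $G_n$ really is the $n$-fold iterate of $G$ by propagating the exact pmf of $X_n$ generation by generation and comparing its generating function with the iterate. A sketch with a made-up finite-support offspring pmf:

```python
import numpy as np

# Made-up offspring pmf for illustration: p0=0.25, p1=0.5, p2=0.25
p = np.array([0.25, 0.5, 0.25])

def G(s):
    """Offspring generating function."""
    return sum(pk * s ** k for k, pk in enumerate(p))

def next_gen(pmf):
    """Exact pmf of X_{n+1} given the pmf of X_n: condition on X_n = j
    and take the j-fold convolution of the offspring pmf."""
    out = np.zeros(1)
    conv = np.array([1.0])             # j-fold convolution, starting at j = 0
    for pj in pmf:
        m = max(len(out), len(conv))
        out = np.pad(out, (0, m - len(out))) + pj * np.pad(conv, (0, m - len(conv)))
        conv = np.convolve(conv, p)
    return out

pmf = np.array([0.0, 1.0])             # X_0 = 1
for _ in range(4):
    pmf = next_gen(pmf)                # now the exact pmf of X_4

s = 0.3
lhs = sum(pk * s ** k for k, pk in enumerate(pmf))  # E(s^{X_4})
rhs = G(G(G(G(s))))                                 # 4-fold iterate of G at s
print(lhs, rhs)                                     # the two coincide
```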

You can obtain the mean by differentiating once, i.e. $G_n'(s) = \sum_{k \geq 0} ( s^k)' p^n_k = \sum_{k \geq 1} k s^{k-1} p^n_k \rightarrow \sum_{k \geq 1} k\, p^n_k = \EE X_n$ as $s \uparrow 1$. Note also that $G_{n-1}(s) \rightarrow 1$ as $s \uparrow 1$, since $\EE 1^{X_{n-1}} = \EE 1 = 1$. Therefore, by the chain rule, $$G_n'(s) = \bigl( G(G_{n-1}(s))\bigr)' = G'( G_{n-1}(s) )\, G_{n-1}'(s) \rightarrow G'(1)\, G_{n-1}'(1)$$

Your equality $(2)$ follows by repeated application. Note that this holds only in the limit $s \uparrow 1$,

so $\EE X_n = G'(1)\, \EE X_{n-1} = \mu\, \EE X_{n-1}$, and iterating gives $\EE X_n = \mu^n$.
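For an offspring law with finite support, $G$ is a polynomial, so the $n$-fold iterate $G_n$ can be composed exactly; its coefficients are the pmf of $X_n$, and $G_n'(1) = \sum_k k\, p^n_k$ is the mean. A sketch with a made-up pmf, checking $\EE X_n = \mu^n$:

```python
import numpy as np
from numpy.polynomial import polynomial as P

# Made-up offspring pmf for illustration: mu = 0*0.2 + 1*0.5 + 2*0.3 = 1.1
p = np.array([0.2, 0.5, 0.3])
mu = np.arange(3) @ p

def compose(c, q):
    """Coefficients of the polynomial c(q(x)), via Horner's scheme."""
    out = np.array([c[-1]])
    for ck in c[-2::-1]:
        out = P.polyadd(P.polymul(out, q), [ck])
    return out

Gn = p.copy()                    # coefficients of G_1 = G
for _ in range(4):
    Gn = compose(p, Gn)          # G_{n+1} = G o G_n; now n = 5

# the coefficients of G_5 are the pmf of X_5, so the mean is G_5'(1):
mean = np.arange(len(Gn)) @ Gn
print(mean, mu ** 5)             # both ≈ 1.61051
```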

As I understand from the previous question, $F_n(s) = \underbrace{G \circ G \circ \cdots \circ G}_{n\ \text{times}}(s)$, which is the $G_n$ above. Therefore $F_m(0) = G_m(0) = \sum_{k \geq 0} 0^k p^m_k = p^m_0 = \PP(X_m = 0)$, since $0^0=1$, which should answer your question (assuming that it pertains to the last equality only; let me know if you have problems with the others).