Alternating Probability Generating Functions in Branching Processes


Define a stochastic process $(X_n)_{n\geq 0}$ so that $X_n$ represents the number of individuals in the population at time $n$. Each generation gives birth to a random number of offspring that forms the next generation, where the offspring distribution alternates between consecutive generations. Hence, with $X_0 = 1$, we have $X_{n+1} = \sum_{k=1}^{X_n} Z_k^{(n)}$, where $Z_k^{(n)} \sim Z_1$ if $n$ is odd and $Z_k^{(n)} \sim Z_2$ if $n$ is even, and where $Z_1, Z_2$ are random variables on $\mathbb{N}$ such that $E(Z_1) = \mu_1 < \infty$ and $E(Z_2) = \mu_2 < \infty$.

a) Define $F_n(s) = E_1[s^{X_n}]$ and $G_i(s) = E[s^{Z_i}]$.

Show that for $0 ≤ s < 1$: $F_{2n}(s) = G_2(G_1(F_{2n-2}(s)))$

b) Suppose that $Z_1 ∼ Po(5)$ and $Z_2 ∼ Po(µ_2)$. Give a range of values for $µ_2$ that guarantee extinction for the population.

For a) I have a proof for the case where there is only one PGF for $Z_k$, but not for two, and I am unable to adapt it.

For b) I yet again know how to do this with one PGF, but am unsure how to use that here.
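One way to see what part a) is claiming, and to sanity-check it, is to simulate the alternating process directly and compare the empirical value of $E[s^{X_2}]$ with the composed PGFs $G_2(G_1(s))$. The sketch below is a minimal Monte Carlo check; the Poisson parameters are illustrative (not the ones from part b)), and it assumes generation 0 uses $Z_2$ and generation 1 uses $Z_1$, as in the statement above:

```python
import math
import random

random.seed(0)

def pgf_po(mu, s):
    # PGF of a Poisson(mu) variable: E[s^Z] = exp(mu * (s - 1))
    return math.exp(mu * (s - 1.0))

def sample_po(mu):
    # Knuth's multiplicative sampler; fine for small mu
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

mu1, mu2, s = 0.8, 0.5, 0.5   # illustrative values, not the ones from part b)
N = 200_000
acc = 0.0
for _ in range(N):
    x1 = sample_po(mu2)                          # generation 0 (even) uses Z_2
    x2 = sum(sample_po(mu1) for _ in range(x1))  # generation 1 (odd) uses Z_1
    acc += s ** x2
empirical = acc / N
closed_form = pgf_po(mu2, pgf_po(mu1, s))        # F_2(s) = G_2(G_1(s))
print(empirical, closed_form)
```

With the seed fixed, the empirical average agrees with the composed PGF to about two decimal places.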

BEST ANSWER

1.)

Really this comes down to manipulating the tower property of conditional expectations. In substance this is the same thing you do with a 'regular' branching process, except this is an alternating process, so you ultimately want to look at the whole process two generations at a time. Since this is a recurrence starting from $X_0$, we should take a look at the beginning for a base case.
$X_{n+1}=\sum_{k=1}^{X_{n}}Z_k^{(n)}$
so
$X_{2}=\sum_{k=1}^{X_{1}}Z_k^{(1)}$ and
$X_{1}=\sum_{k=1}^{X_{0}}Z_k^{(0)} =Z_1^{(0)}$
Note that I assume all $Z_k^{(n)}$ are independent (and indeed i.i.d. for common $n \bmod 2$), as this is how branching processes (especially with this generating function) are structured. Technically this wasn't written in the OP.

$F_{2n}(s) = E\Big[s^{X_{2n}}\Big]= E\Big[E\big[s^{X_{2n}}\big \vert X_{2n-1}\big]\Big] $
and we focus on the r.v.
$E\big[s^{X_{2n}}\big \vert X_{2n-1}\big] $
$= E\Big[s^{\sum_{k=1}^{X_{2n-1}}Z_k^{(2n-1)}}\big \vert X_{2n-1}\Big] $
$= E\Big[\prod_{k=1}^{X_{2n-1}} s^{Z_k^{(2n-1)}}\big \vert X_{2n-1}\Big] $
$= E\big[s^{Z^{(2n-1)}}\big]^{X_{2n-1}} $
i.e. considering $X_{2n-1}=r $ this reads
$ E\Big[\prod_{k=1}^{X_{2n-1}} s^{Z_k^{(2n-1)}}\big \vert X_{2n-1}=r\Big]= E\Big[\prod_{k=1}^{r} s^{Z_k^{(2n-1)}}\big \vert X_{2n-1}=r\Big] = E\Big[\prod_{k=1}^{r} s^{Z_k^{(2n-1)}}\Big] = E\Big[ s^{Z^{(2n-1)}}\Big]^r $
where for a fixed superscript, the $Z_k$ are i.i.d., so the product of expectations collapses to a power of a single expectation. Now letting $t:=E\big[s^{Z^{(2n-1)}}\big] = G_1(s)$ (the superscript $2n-1$ is odd, so these offspring are distributed as $Z_1$),

this reads
$F_{2n}(s) = E\Big[s^{X_{2n}}\Big]= E\Big[E\big[s^{X_{2n}}\big \vert X_{2n-1}\big]\Big]= E\Big[t^{X_{2n-1}}\Big]=E\Big[G_1(s)^{X_{2n-1}}\Big]=F_{2n-1}\big(G_1(s)\big) $

Since $F_{0}(s)=s$, in terms of a base case, consider $n=1$; we should get
$F_{2}(s) = G_2(G_1(F_{0}(s))) = G_2(G_1(s)) = E\big[G_1(s)^{Z^{(0)}}\big]= E\big[E[s^{Z^{(1)}}]^{Z^{(0)}}\big]$
but there is an awful lot of symbol manipulation here with many different indices so there's a good chance of a bug or two.

To finish this off, re-run the argument for $F_{2n-1}(s)$ (conditioning on $X_{2n-2}$, whose individuals reproduce according to $G_2$, so $F_{2n-1}(s)=F_{2n-2}(G_2(s))$) and induct backward.
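The two-step recursion can also be checked numerically: writing $H = G_2 \circ G_1$, we have $F_{2n} = H^{\circ n}$, and since iterates of the same map commute, $G_2(G_1(F_{2n-2}(s)))$ and $F_{2n-2}(G_2(G_1(s)))$ are the same function. A minimal sketch, with illustrative Poisson means:

```python
import math

def G1(s): return math.exp(0.8 * (s - 1.0))  # PGF of Po(0.8), illustrative
def G2(s): return math.exp(0.5 * (s - 1.0))  # PGF of Po(0.5), illustrative

def F(two_n, s):
    # F_{2n}(s) computed by iterating the two-generation map H = G2 ∘ G1
    for _ in range(two_n // 2):
        s = G2(G1(s))
    return s

s = 0.3
lhs = G2(G1(F(4, s)))   # G_2(G_1(F_{2n-2}(s))) with 2n = 6
rhs = F(4, G2(G1(s)))   # F_{2n-2}(G_2(G_1(s)))
print(lhs, rhs, F(6, s))  # all three agree
```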

2.)
An application of the chain rule tells you that
$\frac{d}{ds}F_{2n}(s) =G_2'\big(G_1(F_{2n-2}(s))\big)\cdot G_1'\big(F_{2n-2}(s)\big)\cdot \frac{d}{ds}F_{2n-2}(s)$
and to get the expected value evaluate at
$\frac{d}{ds}F_{2n}(s)\Big\vert_{s\uparrow 1}$
(Abel's theorem in the background). Since every PGF equals $1$ at $s=1$, each factor evaluates to the corresponding mean.
Considering the case of $n=1$, i.e. $2n=2$, gives you a base case value of $\mu_1\cdot \mu_2$, and induction from there gives $E[X_{2n}] = (\mu_1\cdot\mu_2)^n$.
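As a numerical sanity check of the mean (a sketch with $\mu_1 = 5$ from part b); the value of $\mu_2$ and the number of two-step generations are illustrative), one can differentiate $F_{2n}$ at $s \uparrow 1$ by a one-sided finite difference and compare with $(\mu_1\mu_2)^n$:

```python
import math

mu1, mu2, n = 5.0, 0.1, 3   # mu1 = 5 as in part b); mu2 and n are illustrative

def F2n(s, n):
    # F_{2n}(s) = (G2 ∘ G1)^n (s) with Poisson offspring PGFs exp(mu*(s-1))
    for _ in range(n):
        s = math.exp(mu2 * (math.exp(mu1 * (s - 1.0)) - 1.0))
    return s

# E[X_{2n}] = lim_{s↑1} F_{2n}'(s); approximate by a one-sided difference
h = 1e-6
numeric_mean = (F2n(1.0, n) - F2n(1.0 - h, n)) / h
print(numeric_mean, (mu1 * mu2) ** n)   # both close to 0.125
```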

Really the mean is enough to settle extinction probabilities for any $Z_i$ defined on the natural numbers, so long as they have non-zero probability of being zero.

The Markov chain view (taken two generations at a time) is that you have a countable-state chain which is connected, has state zero absorbing, and has every non-zero state transient. So all the probability mass is either absorbed in state zero or runs off to $\infty$.

But if the mean of the process is $\leq 1$, you can apply Markov's inequality (or, better but more work, an upcrossing inequality) to show that the probability of $X_{2n}$ becoming arbitrarily large is arbitrarily small. A different view specializes to a mean $\lt 1$: since $E[X_{2n}] = (\mu_1\cdot\mu_2)^n$, direct application of Markov's inequality pinches $Pr(X_{2n}\geq \delta) \leq \frac{(\mu_1\cdot \mu_2)^{n}}{\delta}$ for any $\delta \gt 0$, which is arbitrarily small for large enough $n$. The case of mean $=1$ being extinction with probability 1 is then implied by continuity of the generating function (and the fact that extinction probability $=1$ is a closed set). There are several other ways to view this of course, but the point is that the mean gives an awful lot of information in this case.

For part b), $\mu_1 = 5$ and a Poisson variable always has non-zero probability of being zero, so extinction is guaranteed exactly when $\mu_1\cdot\mu_2 = 5\mu_2 \leq 1$, i.e. $0 \leq \mu_2 \leq 1/5$.
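The extinction threshold can be checked directly: the extinction probability of the two-step chain is the smallest fixed point of $H = G_2 \circ G_1$ in $[0,1]$, reached by iterating $H$ from $0$. A sketch with $\mu_1 = 5$ as in part b) (Poisson PGFs assumed; the two $\mu_2$ values are illustrative):

```python
import math

def extinction_prob(mu1, mu2, iters=10_000):
    # smallest fixed point of H = G2 ∘ G1 on [0,1]: iterate H starting from 0
    q = 0.0
    for _ in range(iters):
        q = math.exp(mu2 * (math.exp(mu1 * (q - 1.0)) - 1.0))
    return q

print(extinction_prob(5.0, 0.1))   # mu1*mu2 = 0.5 <= 1: extinction certain
print(extinction_prob(5.0, 0.5))   # mu1*mu2 = 2.5 > 1: extinction prob < 1
```

In the subcritical case the iteration converges geometrically to 1; in the supercritical case it converges to a fixed point strictly below 1.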