I was assigned to prove the following inequality, given that the offspring distribution $X$ of a branching process has mean $\mu$ and variance $\sigma^2<\infty$: $q\leq1-\frac{\mu-1}{\sigma^2+\mu^2-\mu}$, where $q:=\mathbb{P}(\text{Branching process will go extinct})$. Using the properties of the generating function $f_X(s):=f(s)$, I can rewrite this as
$$q=f(q)\leq1-\frac{f'(1)-1}{f''(1)}=f(1)-\frac{f'(1)-f(1)}{f''(1)}$$
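(Here I'm using the standard derivative identities for the generating function, which is why the denominator becomes $f''(1)$:
$$f'(1)=\mathbb{E}[X]=\mu,\qquad f''(1)=\mathbb{E}[X(X-1)]=\mathbb{E}[X^2]-\mu=\sigma^2+\mu^2-\mu,$$
so the right-hand side of the rewritten bound agrees term by term with the original one.)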
So far this has led nowhere. I have been wondering what connection would make this problem make sense to me, and I have tried rewriting it in several different forms, to no avail. For example, perhaps the form $\frac{\mu-1}{f''(1)}\leq1-q=\mathbb{P}(\text{Branching process goes on forever})$ would offer more insight. Still, I couldn't figure out how this is connected to the second derivative of the generating function at $s=1$.
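As a sanity check (not a proof), the bound can at least be verified numerically: $q$ is the smallest fixed point of $f$ in $[0,1]$, which can be computed by iterating $q_{n+1}=f(q_n)$ from $q_0=0$. The offspring distribution below, $\mathbb{P}(X=0)=0.2$, $\mathbb{P}(X=1)=0.3$, $\mathbb{P}(X=2)=0.5$, is my own choice and not part of the assignment:

```python
def pgf(s, probs):
    """Probability generating function f(s) = sum_k P(X=k) s^k."""
    return sum(p * s**k for k, p in enumerate(probs))

def extinction_prob(probs, iters=1000):
    """Smallest fixed point of f on [0,1]: iterate q_{n+1} = f(q_n) from q_0 = 0.
    (q_n is the probability of extinction by generation n, which increases to q.)"""
    q = 0.0
    for _ in range(iters):
        q = pgf(q, probs)
    return q

probs = [0.2, 0.3, 0.5]                            # P(X=0), P(X=1), P(X=2)
mu = sum(k * p for k, p in enumerate(probs))       # mean = 1.3
ex2 = sum(k * k * p for k, p in enumerate(probs))  # E[X^2] = 2.3
var = ex2 - mu**2                                  # sigma^2 = 0.61

q = extinction_prob(probs)
bound = 1 - (mu - 1) / (var + mu**2 - mu)
print(q, bound)
```

For this example the fixed-point equation $f(s)=s$ reads $0.5s^2-0.7s+0.2=0$, with roots $0.4$ and $1$, so $q=0.4$, while the bound evaluates to $1-0.3/1.0=0.7$; the inequality holds comfortably here.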
This is an assignment, so I'm not asking that anyone solve this problem for me, but I would appreciate any hint as to where I could start looking, as I have been stuck on this for a while now.
Edit: I found a proof of this inequality under the assumption that $1-q=\mathbb{P}(Z_t>0\ \forall\, t>0)=\mathbb{P}(\lim_{t\to\infty}Z_t>0)$, where $Z_t$ is the size of generation $t$. However, I'm not 100% sure that this is an entirely "legal" equality. If anyone could confirm whether $\mathbb{P}(\text{Branching process goes on forever})=\mathbb{P}(\lim_{t\to\infty}Z_t>0)$, that would solve my problem immediately.