Let $X$ be a random variable taking values in $\mathbb{N}_{0}$, with probability-generating function $G(z) = \sum_{x \in \mathbb{N}_{0}} p(x) z^x$. Assume that $p(0) > 0$, $\mathbb{E}[X] = 1$ and $\operatorname{Var}[X]$ is finite. Why do these conditions imply that the integral $\int_{0}^{1} (1-z)(G(z)-z)^{-1} dz$ diverges to $+ \infty$? I do not see the intuition behind this integral, or its relation to the expectation and variance of $X$.
Probability-generating function and variance

Asked 2026-04-01 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 161 views

There are 3 best solutions below.
Answer (score 12):
Since $G^\prime(1^-)=\mu=1$ and $G^{\prime\prime}(1^-)=\mathbb{E}[X(X-1)]=\sigma^2+\mu^2-\mu=\sigma^2$, a Taylor expansion at $z=1$ gives
$$G(1-\epsilon)=G(1^-)-G^\prime(1^-)\epsilon+\frac12G^{\prime\prime}(1^-)\epsilon^2+o(\epsilon^2)=1-\epsilon+\frac12\sigma^2\epsilon^2+o(\epsilon^2)$$
for $0<\epsilon\ll1$, i.e. $G(z)-z=\frac12\sigma^2(1-z)^2+o((1-z)^2)$ as $z\to1^-$. Note that $\sigma^2>0$: since $p(0)>0$ and $\mathbb{E}[X]=1$, $X$ cannot be the constant $1$. So near the upper limit the integrand behaves like $\frac{2}{\sigma^2(1-z)}$, and the integral diverges like $\frac{2}{\sigma^2}\int_c^1\frac{dz}{1-z}=+\infty$. (By contrast, at the lower limit the integrand is finite: at $z=0$ it equals $\frac{1}{G(0)}=\frac{1}{p(0)}<\infty$.)
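A quick numerical sanity check of the asymptotic $(1-z)/(G(z)-z)\sim 2/(\sigma^2(1-z))$, using the toy distribution $p(0)=p(2)=\tfrac12$ (my choice of example, not from the answer), for which $G(z)=(1+z^2)/2$, $\mu=1$ and $\sigma^2=1$:

```python
# Toy example (assumption, not from the answer): p(0) = p(2) = 1/2,
# so G(z) = (1 + z^2)/2, E[X] = G'(1) = 1, Var[X] = sigma^2 = 1.
# Here G(z) - z = (1 - z)^2 / 2 exactly, so the integrand matches
# the leading-order prediction 2 / (sigma^2 (1 - z)).

def G(z):
    return (1 + z**2) / 2

sigma2 = 1.0

for eps in (1e-1, 1e-2, 1e-4):
    z = 1 - eps
    integrand = (1 - z) / (G(z) - z)
    predicted = 2 / (sigma2 * (1 - z))
    print(eps, integrand, predicted)
```

The integrand blows up like $2/(\sigma^2(1-z))$ as $z\to1^-$, which is exactly the non-integrable $1/(1-z)$ singularity driving the divergence.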
HINT
Note that $\mathbb{E}[X] = \sum_x xp(x) = G'(1)$, and the condition that the variance is finite amounts to a finite second moment, in other words $G''(1) = \sum_x x(x-1) p(x) < \infty$. Can you take it from here?
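To make the hint concrete, here is a small stdlib-only check, using Poisson(1) as an illustrative distribution (my choice, not from the hint), that $G'(1)=\mathbb{E}[X]$ and that $G''(1)=\mathbb{E}[X(X-1)]$ recovers the variance via $\operatorname{Var}[X]=G''(1)+G'(1)-G'(1)^2$:

```python
import math

# Illustrative distribution (assumption): X ~ Poisson(1), p(x) = e^{-1} / x!,
# truncated at x = 50 (the neglected tail is astronomically small).
p = [math.exp(-1) / math.factorial(x) for x in range(50)]

Gp1 = sum(x * px for x, px in enumerate(p))             # G'(1)  = E[X]
Gpp1 = sum(x * (x - 1) * px for x, px in enumerate(p))  # G''(1) = E[X(X-1)]
var = Gpp1 + Gp1 - Gp1**2                               # Var[X]

print(Gp1, var)  # both are ~ 1 for Poisson(1)
```

For Poisson(1) both the mean and the variance equal 1, so this distribution satisfies the question's hypotheses (including $p(0)=e^{-1}>0$).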