Derivative of the cumulant generating function at zero equals the expectation


Let $X$ be a random variable with values in $\mathbb{N}_0$. Then we can define the cumulant generating function of $X$ via $$ F_{X}\colon (-\infty, 0] \rightarrow \mathbb{R}, \qquad t \mapsto \log\Bigl(\sum_{k=0}^{\infty} P(X=k) \exp(tk)\Bigr).$$

The infinite sum in the definition converges for every non-positive $t$ and, depending on the distribution of $X$, sometimes also for $t \in (0,a]$ for some $a>0$. What I want to show is that $$F_{X}^\prime(0)=E(X) \quad \text{and} \quad F_{X}^{\prime \prime}(0) = \operatorname{Var}(X), $$ where the derivatives at $0$ are understood as one-sided (left) derivatives, since the domain is $(-\infty,0]$.
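As a sanity check (not a proof), these identities can be verified numerically for a concrete distribution. The sketch below assumes $X \sim \mathrm{Poisson}(\lambda)$, for which $E(X) = \operatorname{Var}(X) = \lambda$; it approximates $F_X$ by a truncated sum and the derivatives by finite differences at a small negative $t$, staying inside the domain $(-\infty, 0]$.

```python
import math

def cgf(t, lam, kmax=100):
    """Truncated cumulant generating function of Poisson(lam):
    log of sum_{k=0}^{kmax} P(X=k) * exp(t*k)."""
    s = sum(math.exp(-lam) * lam**k / math.factorial(k) * math.exp(t * k)
            for k in range(kmax + 1))
    return math.log(s)

lam = 3.0
h = 1e-4
t0 = -2 * h  # center the difference stencil strictly inside (-inf, 0]

# Central difference approximations of F'(t0) and F''(t0); for t0 near 0
# these should be close to E(X) = lam and Var(X) = lam respectively.
d1 = (cgf(t0 + h, lam) - cgf(t0 - h, lam)) / (2 * h)
d2 = (cgf(t0 + h, lam) - 2 * cgf(t0, lam) + cgf(t0 - h, lam)) / h**2

print(d1)  # close to 3.0 = E(X)
print(d2)  # close to 3.0 = Var(X)
```

For the Poisson distribution the exact cumulant generating function is $F_X(t) = \lambda(e^t - 1)$, so both derivatives at $0$ equal $\lambda$, matching the finite-difference values up to $O(h)$ error.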

If I just naively differentiate $F_{X}$, first differentiating the $\log$ and then every term of the sum, and evaluate at $t=0$, then I immediately get the desired results. But of course one has to prove that differentiation and infinite summation may be interchanged here. To do this, I tried to show that the partial sums of first and second derivatives, $$ s_n := \sum_{k=0}^{n} P(X=k) \exp(tk)\,k \qquad \text{and} \qquad w_n := \sum_{k=0}^{n} P(X=k) \exp(tk)\,k^2, $$ converge uniformly on some interval $I:=[-\delta,0]$. But so far I have only managed to prove uniform convergence on an arbitrary interval $[a,b]$ with $a<b<0$. Is this sufficient? If not, how can I show uniform convergence of $s_n$ and $w_n$ on $I$? Or does someone have a completely different idea for computing the derivatives of $F_{X}$ at zero? Thanks for any idea!
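For reference, here is the naive term-by-term computation mentioned above, writing $p_k := P(X=k)$; the open point is precisely the justification for differentiating under the sum:

```latex
% Differentiating the log and then each term of the sum (the step to be justified):
F_X'(t) = \frac{\sum_{k=0}^{\infty} p_k\, k\, e^{tk}}{\sum_{k=0}^{\infty} p_k\, e^{tk}},
\qquad
F_X'(0) = \frac{\sum_{k=0}^{\infty} p_k\, k}{\sum_{k=0}^{\infty} p_k} = E(X).

% Differentiating once more (quotient rule) and evaluating at t = 0,
% using \sum_k p_k = 1:
F_X''(0)
  = \frac{\sum_{k} p_k\, k^2 \cdot \sum_{k} p_k
          - \bigl(\sum_{k} p_k\, k\bigr)^2}{\bigl(\sum_{k} p_k\bigr)^2}
  = E(X^2) - E(X)^2 = \operatorname{Var}(X).
```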