Let $\{X_n:n\geqslant0\}$ be the embedded queue-length chain of an M/G/1 queue, i.e. $X_n$ is the number of customers left behind by the $n$-th departing customer; it is an irreducible aperiodic Markov chain. Arrivals are governed by a Poisson process $\{N(t):t\geqslant 0\}$ of intensity $\lambda$, and the service times $S_n$ are i.i.d. with common distribution $F$, independent of $\{N(t)\}$. If $\rho:=\lambda\mathbb E[S]<1$, then the Markov chain is positive recurrent and converges in distribution to $X_\infty$, where $$ \hat\pi(z) = \mathbb E[z^{X_\infty}] = \frac{(1-\rho)(z-1)\hat g(\lambda(1-z))}{z-\hat g(\lambda(1-z))} $$ and $$ \hat g(s) = \int_0^\infty e^{-st}\ \mathsf dF(t) $$ is the Laplace-Stieltjes transform of $F$.
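As a quick sanity check of the transform (the parameters $\lambda$, $\mu$ are my own test case): with exponential service, $\hat g(s)=\mu/(\mu+s)$, and $\hat\pi$ should collapse to the geometric pgf $(1-\rho)/(1-\rho z)$ of the M/M/1 queue:

```python
import math

# Pollaczek-Khinchine transform specialized to Exp(mu) service (M/M/1),
# with arrival rate lam and service rate mu chosen for illustration.
lam, mu = 1.0, 2.0
rho = lam / mu              # utilization rho = lam * E[S] = 0.5 < 1

def g_hat(s):
    """Laplace-Stieltjes transform of Exp(mu) service times."""
    return mu / (mu + s)

def pi_hat(z):
    """P-K transform (1-rho)(z-1) g^(lam(1-z)) / (z - g^(lam(1-z)))."""
    g = g_hat(lam * (1.0 - z))
    return (1.0 - rho) * (z - 1.0) * g / (z - g)

# For M/M/1 the stationary distribution is geometric, pi_j = (1-rho) rho^j,
# with pgf (1-rho)/(1-rho z); the transform should agree for |z| < 1.
max_err = max(abs(pi_hat(z) - (1 - rho) / (1 - rho * z))
              for z in (0.0, 0.25, 0.5, 0.9, 0.99))
```

The agreement is exact up to floating-point error, which is reassuring before wading into the general proof.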
The proof hinges on the recurrence \begin{align} X_{n+1} &= X_n-1+Y_{n},\quad X_n\geqslant 1\\ X_{n+1} &= Y_n,\quad X_n=0, \end{align} where $Y_n$ is the number of arrivals during the $(n+1)$-th service time, so that for each nonnegative integer $j$, $$ a_j=\mathbb P(Y_n=j) = \int_0^\infty e^{-\lambda x}\frac{(\lambda x)^j}{j!}\ \mathsf dF(x). $$ We see immediately that the transition probabilities of the Markov chain are \begin{align} P_{0j} &= a_j = \int_0^\infty e^{-\lambda x}\frac{(\lambda x)^j}{j!}\ \mathsf dF(x)\\ P_{ij} &= a_{j-i+1} = \int_0^\infty e^{-\lambda x}\frac{(\lambda x)^{j-i+1}}{(j-i+1)!}\ \mathsf dF(x),\quad 1\leqslant i\leqslant j+1\\ P_{ij} &=0, \text{ otherwise}. \end{align} Note that $\rho = \sum_{j=1}^\infty ja_j=\lambda\mathbb E[S]$, since conditional on a service time of length $x$ the number of arrivals is Poisson with mean $\lambda x$. The stationary equations $\pi=\pi P$ read $$ \pi_j = \pi_0a_j + \sum_{i=1}^{j+1}\pi_i a_{j-i+1},\quad j\geqslant0. \tag1 $$ Let $\pi(s) = \sum_{j=0}^\infty \pi_js^j$ and $A(s) = \sum_{j=0}^\infty a_js^j$ be the generating functions of $\{\pi_j\}$ and $\{a_j\}$, respectively; multiplying $(1)$ by $s^j$ and summing over $j$ yields \begin{align} \pi(s) &= \pi_0A(s) + \sum_{j=0}^\infty\sum_{i=1}^{j+1}\pi_i a_{j-i+1}s^j\\ &= \pi_0A(s) + s^{-1}\sum_{i=1}^\infty \pi_is^i\sum_{j=i-1}^\infty a_{j-i+1}s^{j-i+1}\\ &=\pi_0A(s) + s^{-1}\sum_{i=1}^\infty \pi_is^i\sum_{j=0}^\infty a_{j}s^j\\ &= \pi_0A(s) + (\pi(s)-\pi(0))A(s)/s, \end{align} from which we obtain $$ \pi(s) = \frac{(s-1)\pi_0A(s)}{s-A(s)}. $$ Then as $\lim_{s\uparrow 1}A(s) = \sum_{i=0}^\infty a_i =1$, L'Hôpital's rule gives $$ \lim_{s\uparrow1}\pi(s)=\pi_0\lim_{s\uparrow1}\frac{s-1}{s-A(s)} = \pi_0(1-A'(1))^{-1}. $$ Now $A'(1)=\sum_{i=0}^\infty ia_i=\rho$, and thus $\lim_{s\uparrow1}\pi(s) = \frac{\pi_0}{1-\rho}$. But $\lim_{s\uparrow1}\pi(s) = \sum_{i=0}^\infty \pi_i = 1$, whence $\pi_0 = 1-\rho = 1-\lambda\mathbb E[S]$. Hence when $\rho<1$, $$ \pi(s) = \frac{(1-\lambda\mathbb E[S])(s-1)A(s)}{s-A(s)}.\tag 2 $$
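One can confirm $(2)$ numerically: solve the stationary equations $(1)$ by forward recursion and compare the resulting pgf with the closed form. The test case below (deterministic service of length $d$, i.e. M/D/1, with $\lambda$, $d$, and the evaluation point $s_0$ my own choices) has $a_j = e^{-\lambda d}(\lambda d)^j/j!$ and $A(s)=e^{-\lambda d(1-s)}$:

```python
import math

# M/D/1 test case: deterministic service of length d, arrival rate lam.
lam, d = 0.5, 1.0
rho = lam * d               # = 0.5 < 1

N = 60                      # truncation level; the tail of pi_j decays geometrically
a = [math.exp(-lam * d) * (lam * d) ** j / math.factorial(j) for j in range(N)]

# Solve (1) recursively: isolating the i = j+1 term gives
#   pi_{j+1} = (pi_j - pi_0 a_j - sum_{i=1}^{j} pi_i a_{j-i+1}) / a_0,
# starting from pi_0 = 1 - rho.
pi = [1.0 - rho]
for j in range(N - 1):
    partial = pi[0] * a[j] + sum(pi[i] * a[j - i + 1] for i in range(1, j + 1))
    pi.append((pi[j] - partial) / a[0])

s0 = 0.5
lhs = sum(p * s0 ** j for j, p in enumerate(pi))      # pgf built from (1)
A = math.exp(-lam * d * (1.0 - s0))
rhs = (1.0 - rho) * (s0 - 1.0) * A / (s0 - A)         # closed form (2)
```

The truncated pgf matches $(2)$ to near machine precision, and $\sum_j\pi_j$ comes out as $1$, confirming both the recursion and the value $\pi_0=1-\rho$.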
My question: How do we get from $(2)$ to $$ \mathbb E[X_\infty] = \rho + \frac{\rho^2}{1-\rho}\cdot\frac{c_s^2+1}2, $$ where $c_s^2 = \frac{\mathrm{Var}(S)}{\mathbb E[S]^2}$ is the squared coefficient of variation of the service time? The hint is to apply L'Hôpital's rule twice to find $\mathbb E[X_\infty]=\lim_{s\uparrow 1}\pi'(s)$, but I don't see it. How can I compute this limit?
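For what it's worth, I did convince myself numerically that the claimed mean is the right target (again with my own M/D/1 test case, where $\mathrm{Var}(S)=0$, so $c_s^2=0$ and the formula predicts $\rho + \rho^2/(2(1-\rho))$); since the singularity of $\pi$ at $s=1$ is removable, a central difference straddling $s=1$ approximates the limit of $\pi'(s)$:

```python
import math

# M/D/1 test case: deterministic service of length d, arrival rate lam.
lam, d = 0.5, 1.0
rho = lam * d

def pi_pgf(s):
    """Formula (2) with A(s) = exp(-lam d (1-s)) for deterministic service."""
    A = math.exp(-lam * d * (1.0 - s))
    return (1.0 - rho) * (s - 1.0) * A / (s - A)

# E[X_inf] = lim_{s->1} pi'(s); the singularity at s = 1 is removable,
# so a symmetric difference quotient across s = 1 converges to the limit.
h = 1e-4
mean_numeric = (pi_pgf(1.0 + h) - pi_pgf(1.0 - h)) / (2.0 * h)
mean_formula = rho + rho ** 2 / (2.0 * (1.0 - rho))   # c_s^2 = 0 here
```

The two values agree to several digits, so the question really is just about carrying out the double application of L'Hôpital's rule symbolically.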