A Markov Chain with states 0,1,... has transition probabilities $$p_{jk}=e^{-a} \sum_{r=0}^k \left( \begin{matrix} j \\ r \end{matrix} \right) p^r (1-p)^{j-r} a^{k-r} / (k-r)!$$ Show that the limiting distribution $\{ v_k\}$ is Poisson with parameter $a/(1-p)$.
My attempt was using:
$$ \sum_{k=0}^\infty p_{jk} = 1 $$ and
$$ v_k = \sum_{j=0}^\infty v_j p_{jk} $$
But I am really at a loss as to how to simplify
$$ v_k = \sum_{j=0}^\infty v_j e^{-a} \sum_{r=0}^k \left( \begin{matrix} j \\ r \end{matrix} \right) p^r (1-p)^{j-r} a^{k-r} / (k-r)! $$
The Markov chain, let's call it $Z=(Z_n)_{n\ge 0}$, looks to be a branching process with immigration. The offspring distribution is Bernoulli with mean $p$ and the immigration distribution is Poisson with mean $a$. That is to say, $$ Z_{n+1} \stackrel{d}{=} Y_{n+1}+\sum_{k=1}^{Z_n}\xi_k^{(n)}, $$ where $Y_{n+1}$ is Poisson($a$), the $\xi_k^{(n)}$ are Bernoulli($p$), and all the $Y_n$s and $\xi_k^{(n)}$s are independent.
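As a quick sanity check (a minimal simulation sketch, not part of the proof; the parameter values $a=1$, $p=1/2$ are illustrative), one can iterate this recursion and compare the long-run empirical mean with $a/(1-p)$:

```python
import math
import random

def poisson(a, rng):
    # Knuth's multiplicative algorithm for sampling Poisson(a); fine for small a
    L = math.exp(-a)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= L:
            return k
        k += 1

def step(z, a, p, rng):
    # One transition of Z: Binomial(z, p) surviving offspring plus Poisson(a) immigrants
    survivors = sum(rng.random() < p for _ in range(z))
    return survivors + poisson(a, rng)

def simulate(a=1.0, p=0.5, n_steps=200_000, seed=0):
    # Long-run average of Z_n; should approach the stationary mean a/(1-p)
    rng = random.Random(seed)
    z, total = 0, 0
    for _ in range(n_steps):
        z = step(z, a, p, rng)
        total += z
    return total / n_steps
```

With $a=1$, $p=1/2$ the stationary mean is $a/(1-p)=2$, and the simulated average lands close to that.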
If there is to be a stationary distribution for this chain, then a random variable $X$ with that distribution will satisfy the distributional identity $$ X \stackrel{d}{=} Y+\sum_{k=1}^X\xi_k,\qquad\qquad (1) $$ where on the right, $Y$, $X$, and the $\xi_k$s are independent and have the same distributions as their earlier discussed namesakes.
Using generating functions it's not hard to see that if $X$ is Poisson($a/(1-p)$) then (1) is true.
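For completeness, the generating-function check runs as follows. Write $\mu=a/(1-p)$ and let $G_W$ denote the probability generating function of a random variable $W$, so $$ G_Y(s)=e^{a(s-1)},\qquad G_\xi(s)=1-p+ps,\qquad G_X(s)=e^{\mu(s-1)}. $$ By independence, the right-hand side of (1) has generating function $$ G_Y(s)\,G_X\!\big(G_\xi(s)\big)=e^{a(s-1)}\,e^{\mu(1-p+ps-1)}=e^{(a+\mu p)(s-1)}, $$ and $a+\mu p=\mu$ precisely when $\mu=a/(1-p)$, so both sides of (1) have the same generating function and hence the same distribution.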
Because the branching process with Bernoulli($p$) offspring distribution is subcritical and $P(Y=0)>0$, the Markov chain $Z$ is recurrent. Because the associated invariant measure equation $v=v{\bf P}$ has a solution $v$ with finite total mass (namely, the Poisson($a/(1-p)$) distribution), the process $Z$ is in fact positive recurrent (and clearly aperiodic). It follows from Markov chain theory that $$ \lim_n P(Z_n=k) = e^{-\mu}{\mu^k\over k!},\qquad k=0,1,2,\ldots, $$ where $\mu=a/(1-p)$.
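One can also confirm the limit numerically straight from the transition probabilities $p_{jk}$ in the problem statement, by power-iterating $v\mapsto v{\bf P}$ on a truncated state space (a rough check, not a proof; the truncation size and parameter values below are arbitrary choices):

```python
import math

def p_jk(j, k, a, p):
    # p_{jk} = e^{-a} * sum_{r=0}^{min(j,k)} C(j,r) p^r (1-p)^{j-r} a^{k-r}/(k-r)!
    total = 0.0
    for r in range(min(j, k) + 1):
        total += (math.comb(j, r) * p**r * (1 - p)**(j - r)
                  * a**(k - r) / math.factorial(k - r))
    return math.exp(-a) * total

def limiting_distribution(a=1.0, p=0.5, n_states=40, n_iter=200):
    # Start the chain at state 0 and repeatedly apply v <- vP on states 0..n_states-1
    v = [1.0] + [0.0] * (n_states - 1)
    for _ in range(n_iter):
        v = [sum(v[j] * p_jk(j, k, a, p) for j in range(n_states))
             for k in range(n_states)]
        s = sum(v)
        v = [x / s for x in v]  # renormalize to absorb truncation error
    return v
```

For $a=1$, $p=1/2$ the iterates settle on the Poisson($2$) probabilities, matching $\mu=a/(1-p)$.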