Let $a, \lambda, \mu$ be real numbers with $a > 0$ and $\lambda < \mu$ (in an attempt to make my life easier, I'm assuming that $\lambda$ and $\mu$ are positive as well).
I am trying to show that
$$\left[\sum_{k=0}^{\infty}\left(\frac{\lambda}{\mu}\right)^k\right]^{a/\lambda} = \sum_{k=0}^{\infty}\left(\frac{\lambda}{\mu}\right)^k\frac{(a/\lambda)(1 + (a/\lambda))(2 + (a/\lambda)) \cdots ((k-1) + (a/\lambda))}{k!}. $$
(For those interested, this is from part of a problem that deals with Poisson processes.)
Clearly, $\sum_{k=0}^{\infty}\left(\frac{\lambda}{\mu}\right)^k < \infty$, since $\lambda < \mu$. Also, the equality clearly holds when $a = \lambda$, since then every coefficient on the right is $1$. So I've tried breaking the problem into the cases $a < \lambda$ and $a > \lambda$.
I've tried using falling/rising factorial representations of the coefficients $$\frac{(a/\lambda)(1 + (a/\lambda))(2 + (a/\lambda)) \cdots ((k-1) + (a/\lambda))}{k!},$$ and also (since $a/\lambda > 0$) the fact that $$\frac{(a/\lambda)(1 + (a/\lambda))(2 + (a/\lambda)) \cdots ((k-1) + (a/\lambda))}{k!} = \frac{\Gamma((a/\lambda) + k)}{\Gamma(a/\lambda)\,k!},$$ but nothing has popped out at me.
I assume there is some "obvious" "trick" here that I'm just not seeing...
To make things simpler, let $\lambda/\mu = x$ and $a/\lambda = y$. Since $0 < x < 1$, the geometric series on the left sums to $1/(1-x)$, so the left side is $$\left(\frac{1}{1-x}\right)^y = (1-x)^{-y}.$$ Now look up the binomial series. The first "special case" is exactly $$(1-x)^{-y} = \sum_{k=0}^{\infty} \frac{y(y+1)\cdots(y+k-1)}{k!}\,x^k,$$ which is your right side.