On the Wikipedia page for the Gamma function I saw an interesting formula $$ \lim_{n\to \infty} \frac{\Gamma(n+\alpha)}{\Gamma(n)n^\alpha} = 1 $$ for all $\alpha\in\Bbb C$. I couldn't find a source for this, and searching here on MSE didn't turn up the result I wanted.
Could anyone show me how this formula is derived?
I'm very inexperienced with properties/identities of $\Gamma$ so forgive me if this question is trivial.
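(For what it's worth, a quick numerical sanity check, for real $\alpha$, does suggest the ratio tends to $1$; this sketch uses Python's `math.lgamma` to work in log-space and avoid overflow, and the function name `ratio` is just mine:)

```python
import math

def ratio(n, alpha):
    # Gamma(n + alpha) / (Gamma(n) * n**alpha), computed via lgamma
    # in log-space so that large n does not overflow a float.
    return math.exp(math.lgamma(n + alpha) - math.lgamma(n) - alpha * math.log(n))

for n in (10, 1000, 10**6):
    # the printed values approach 1 as n grows
    print(n, ratio(n, 0.5))
```

Of course this is only a plausibility check for real $\alpha$, not a proof, and says nothing about complex $\alpha$.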
The usual derivation of this would invoke the Stirling-Laplace asymptotic for $\Gamma(s)$. I'm mildly surprised that this wasn't explicitly worked out on Wikipedia, or in some other easily accessible place on-line.
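For completeness, here is a sketch of how that Stirling computation would go, using the standard form $\Gamma(s)\sim\sqrt{2\pi}\,s^{s-\frac12}e^{-s}$ (and ignoring the error terms, which a careful argument would track):

```latex
\begin{align*}
\frac{\Gamma(n+\alpha)}{\Gamma(n)\,n^{\alpha}}
  &\sim \frac{(n+\alpha)^{\,n+\alpha-\frac12}\, e^{-(n+\alpha)}}
             {n^{\,n-\frac12}\, e^{-n}\, n^{\alpha}}
   = e^{-\alpha}\,\Bigl(1+\tfrac{\alpha}{n}\Bigr)^{\!n+\alpha-\frac12} \\
  &\longrightarrow e^{-\alpha}\cdot e^{\alpha} \;=\; 1,
\end{align*}
```

since $(1+\alpha/n)^{n+\alpha-\frac12}\to e^{\alpha}$ as $n\to\infty$, which holds for complex $\alpha$ as well.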
In fact, a much simpler approach obtains (a stronger version of) this asymptotic via "Watson's Lemma", which goes back over 100 years and is itself easy to prove completely from simple facts. In various places in the literature it is in fact called "the oft-reproven Watson's lemma". :)
The case you mention is a simple corollary of the very first example I wrote out in some notes on asymptotic expansions: http://www.math.umn.edu/~garrett/m/mfms/notes_2013-14/02d_asymptotics_of_integrals.pdf