Does $ \sum_{k=1}^{n} \frac{(n-k)^k}{k!} $ have a closed-form expression in terms of $n \in \mathbb{N}$? It seems to grow a bit faster than $e^{0.5n}$, but there's clearly more to it, and I don't know how to look this up.
I know that:
$ \sum_{k=1}^{\infty} \frac{z^k}{k!} = e^z - 1$. I'm looking for a similar closed-form expression that eliminates the $k$.
$ \sum_{k=1}^{n} \frac{n^k}{k!} = e^n \frac{\Gamma(n+1,n)}{\Gamma(n+1)} - 1$, where the $\Gamma$ ratio tends to $1/2$ as $n \to \infty$.
$ \sum_{k=1}^{\infty} \frac{(n-k)^k}{k!} $ diverges. (It makes no sense for my purpose anyway: once $n-k$ goes negative, the odd powers make the terms alternate in sign.)
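Here is a quick numerical check of the incomplete-gamma identity above, with $\Gamma(n+1) = n!$ in the denominator (this assumes SciPy is available; `scipy.special.gammaincc` is the regularized ratio $\Gamma(a,x)/\Gamma(a)$):

```python
# Check: sum_{k=1}^{n} n^k/k!  ==  e^n * Gamma(n+1, n)/Gamma(n+1) - 1.
# scipy.special.gammaincc(a, x) is the regularized ratio Gamma(a, x)/Gamma(a).
import math
from scipy.special import gammaincc

for n in range(1, 20):
    lhs = sum(n**k / math.factorial(k) for k in range(1, n + 1))
    rhs = math.exp(n) * gammaincc(n + 1, n) - 1
    assert math.isclose(lhs, rhs, rel_tol=1e-9), (n, lhs, rhs)
    # gammaincc(n + 1, n) drifts slowly toward 1/2 as n grows
```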
Thanks!
We can obtain asymptotic approximations of this sum for large $n$. (I take the sum from $k=0$; the extra $k=0$ term just adds $1$.)
First approach. We approximate the sum by an integral and then approximate the gamma function by Stirling's formula. Thus, \begin{align*} \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} &\sim \int_0^n {\frac{{(n - t)^t }}{{\Gamma (t + 1)}}\mathrm{d}t} = n\int_0^1 {n^{ns} \frac{{(1 - s)^{ns} }}{{\Gamma (ns + 1)}}\mathrm{d}s} \\ & \sim \sqrt {\frac{n}{{2\pi }}} \int_0^1 {s^{ - 1/2} \exp \left( { - n\left( {s\log \left( {\frac{s}{{1 - s}}} \right) - s} \right)} \right)\mathrm{d}s} \end{align*} as $n\to +\infty$. The integral may be evaluated asymptotically via the saddle point method: \begin{align*} \int_0^1 {s^{ - 1/2} \exp \left( { - n\left( {s\log \left( {\frac{s}{{1 - s}}} \right) - s} \right)} \right)\mathrm{d}s} & \sim (1 - s_0 )\sqrt {\frac{{2\pi }}{n}} \mathrm{e}^{ - n\left( {s_0 \log \left( {\frac{{s_0 }}{{1 - s_0 }}} \right) - s_0 } \right)} \\ & = \frac{1}{{1 + \Omega }}\sqrt {\frac{{2\pi }}{n}} \mathrm{e}^{\Omega n} \end{align*} as $n\to +\infty$. Here $s_0 = \frac{\Omega }{{1 + \Omega }} = 0.361896 \ldots$ is the sole saddle point on the path of integration and $\Omega=0.5671432\ldots$ satisfies $\Omega \mathrm{e}^\Omega =1$. Therefore $$ \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} \sim \frac{\mathrm{e}^{\Omega n} }{{1 + \Omega }} $$ as $n\to +\infty$. It seems that this approximation is extremely good. For example with $n=10$, the sum is $185.3375027557\ldots$, whereas the approximation yields $185.3375027898\ldots$. The alternative approach below shows the reason behind this accuracy.
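The $n=10$ comparison above is easy to reproduce in a few lines of plain Python, computing $\Omega$ by Newton iteration on $w\mathrm{e}^w = 1$:

```python
# Numerical check of the saddle-point result  S(n) ~ e^{Omega*n}/(1+Omega),
# where Omega satisfies Omega*e^Omega = 1.
import math

# Solve w*e^w = 1 by Newton's method (converges rapidly from w = 0.5).
w = 0.5
for _ in range(50):
    w -= (w * math.exp(w) - 1) / (math.exp(w) * (1 + w))
omega = w  # 0.5671432904...

def S(n):
    """The partial sum  sum_{k=0}^{n} (n-k)^k / k!."""
    return sum((n - k)**k / math.factorial(k) for k in range(n + 1))

n = 10
exact = S(n)                                # 185.3375027557...
approx = math.exp(omega * n) / (1 + omega)  # 185.3375027898...
assert abs(exact - approx) / exact < 1e-9
```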
Second approach. A different approach using generating functions yields a complete asymptotic expansion. We start with the following power series involving the principal branch of the Lambert $W$-function: $$ \exp (nW_0(z)) = \sum\limits_{j = 0}^\infty {\frac{{n(n - j)^{j - 1} }}{{j!}}z^j } , $$ valid for $|z|<\frac{1}{\mathrm{e}}$. Differentiating both sides with respect to $z$, multiplying by $z/n$, and using $W_0'(z) = \frac{W_0(z)}{z(1 + W_0(z))}$ gives $$ \frac{{W_0(z)}}{{1 + W_0(z)}}\exp (nW_0(z)) = \sum\limits_{j = 0}^\infty {\frac{{j(n - j)^{j - 1} }}{{j!}}z^j } . $$ Taking the difference of the two expansions, we find $$ \frac{{\exp (nW_0(z))}}{{1 + W_0(z)}} = \sum\limits_{j = 0}^\infty {\frac{{(n - j)^j }}{{j!}}z^j } , $$ and hence \begin{align*} \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} & = [z^n ]\frac{{\exp (nW_0(z))}}{{(1 + W_0(z))(1 - z)}} = \frac{1}{{2\pi \mathrm{i}}}\oint_{(0 + )} {\frac{{\exp (nW_0 (z))\mathrm{d}z}}{{(1 + W_0 (z))(1 - z)z^{n + 1} }}} \\ & = \frac{1}{{2\pi \mathrm{i}}}\oint_{(0 + )} {\frac{{\mathrm{d}z}}{{W_0^n (z)(1 + W_0 (z))z(1 - z)}}} = \frac{1}{{2\pi \mathrm{i}}}\oint_{(0 + )} {\frac{{\mathrm{d}t}}{{t^{n + 1} (1 - t\mathrm{e}^t )}}} , \end{align*} where the last step substitutes $t = W_0(z)$, i.e., $z = t\mathrm{e}^t$. Accordingly, $$ \frac{1}{{1 - z\mathrm{e}^z }} = \sum\limits_{n = 0}^\infty {\left( {\sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} } \right)z^n } , $$ provided $|z|<\Omega$. If $W_k$ denotes the $k$th branch of the Lambert $W$-function, then $$ \frac{1}{{1 - z\mathrm{e}^z }} - \sum\limits_{k = - \infty }^{ \infty } {\frac{1}{{1 + W_k (1)}}\frac{1}{{1 - z\mathrm{e}^{W_k (1)} }}} $$ is an entire function of $z$ (note that $W_{ \pm k} (1) = - \log (2\pi k) \pm (2k - \frac{1}{2})\pi \mathrm{i} + o(1)$ for $k\ge 1$). Thus $$ \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} \sim \sum\limits_{k = - \infty }^{ \infty } {\frac{{\mathrm{e}^{W_k (1)n} }}{{1 + W_k (1)}}} = \frac{{\mathrm{e}^{\Omega n} }}{{1 + \Omega }} + 2 \sum\limits_{k = 1}^{ \infty } \operatorname{Re}\frac{\mathrm{e}^{W_k (1)n} }{1 + W_k (1)} , $$ as $n\to +\infty$. 
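The generating function can be verified independently of the Lambert-function manipulations: since $z\mathrm{e}^z = \sum_{j \ge 1} z^j/(j-1)!$, the coefficients $a_n = [z^n](1 - z\mathrm{e}^z)^{-1}$ satisfy $a_0 = 1$ and the convolution recurrence $a_n = \sum_{j=1}^{n} a_{n-j}/(j-1)!$, which can be checked in exact arithmetic against the partial sums:

```python
# Check that [z^n] 1/(1 - z*e^z) equals S(n) = sum_{k=0}^{n} (n-k)^k / k!.
# Since z*e^z = sum_{j>=1} z^j/(j-1)!, the coefficients a_n of 1/(1 - z*e^z)
# obey a_0 = 1 and a_n = sum_{j=1}^{n} a_{n-j}/(j-1)!  (convolution recurrence).
from fractions import Fraction
from math import factorial

N = 12
a = [Fraction(0)] * (N + 1)
a[0] = Fraction(1)
for n in range(1, N + 1):
    a[n] = sum(a[n - j] / factorial(j - 1) for j in range(1, n + 1))

for n in range(N + 1):
    S_n = sum(Fraction((n - k)**k, factorial(k)) for k in range(n + 1))
    assert a[n] == S_n, n
```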
In particular, $$ \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} = \frac{{\mathrm{e}^{\Omega n} }}{{1 + \Omega }} + \mathcal{O}(\mathrm{e}^{n\operatorname{Re} W_1 (1)} ) = \frac{{\mathrm{e}^{\Omega n} }}{{1 + \Omega }} + \mathcal{O}(\mathrm{e}^{ - 1.5339133 \ldots \times n} ), $$ which shows why the leading-order asymptotic approximation is so accurate.
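The error estimate can also be checked numerically (a sketch in plain Python; the Newton starting point for the first complex branch comes from the asymptotic formula for $W_{\pm k}(1)$ quoted above):

```python
# The gap between S(n) and e^{Omega*n}/(1+Omega) is governed by the next pair
# of poles at W_{+-1}(1).  Solve w*e^w = 1 by Newton's method, seeding the
# first complex branch with the asymptotic guess -log(2*pi) + (3/2)*pi*i.
import cmath, math

def solve_we_w(start):
    """Newton iteration for w*e^w = 1 from a given starting point."""
    w = complex(start)
    for _ in range(100):
        w -= (w * cmath.exp(w) - 1) / (cmath.exp(w) * (1 + w))
    return w

omega = solve_we_w(0.5).real                                   # 0.5671432904...
w1 = solve_we_w(complex(-math.log(2 * math.pi), 1.5 * math.pi))  # W_1(1)
assert abs(w1.real - (-1.5339133)) < 1e-6  # the decay rate quoted above

n = 10
S = sum((n - k)**k / math.factorial(k) for k in range(n + 1))
leading = math.exp(omega * n) / (1 + omega)
# Adding the W_{+-1}(1) correction shrinks the error well below e^{n Re W_1(1)}:
corrected = leading + 2 * (cmath.exp(w1 * n) / (1 + w1)).real
assert abs(S - leading) < math.exp(n * w1.real)
assert abs(S - corrected) < abs(S - leading) / 10
```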