Does $ \sum_{k=1}^{n} \frac{(n-k)^k}{k!} $ have a closed-form expression in terms of $n \in \mathbb{N}$?


Does $ \sum_{k=1}^{n} \frac{(n-k)^k}{k!} $ have a closed-form expression in terms of $n \in \mathbb{N}$? It seems to grow a bit faster than $e^{0.5n}$, but there's clearly more to it, and I don't know how to look this up.

I know that:

$ \sum_{k=1}^{\infty} \frac{z^k}{k!} = e^z - 1$. I'm looking for a similar closed-form expression that eliminates the $k$.

$ \sum_{k=1}^{n} \frac{n^k}{k!} = e^n \frac{\Gamma(n+1,n)}{\Gamma(n+1)} - 1$, where the $\Gamma$ ratio is close to $1/2$ for large $n$.
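This incomplete-gamma identity can be sanity-checked in pure Python via the integer-order recurrence $\Gamma(s+1,x)=s\,\Gamma(s,x)+x^{s}e^{-x}$, with $\Gamma(1,x)=e^{-x}$ (a sketch; the choice $n=10$ is illustrative, and note the normalization $\Gamma(n+1)=n!$ in the denominator):

```python
import math

def upper_gamma_int(n, x):
    """Upper incomplete gamma Gamma(n, x) for integer n >= 1,
    built up via Gamma(s+1, x) = s*Gamma(s, x) + x**s * exp(-x)."""
    g = math.exp(-x)  # Gamma(1, x)
    for s in range(1, n):
        g = s * g + x**s * math.exp(-x)
    return g

n = 10
lhs = sum(n**k / math.factorial(k) for k in range(1, n + 1))
rhs = math.exp(n) * upper_gamma_int(n + 1, n) / math.factorial(n) - 1
ratio = upper_gamma_int(n + 1, n) / math.factorial(n)  # roughly 1/2-ish
```

The recurrence keeps every term positive, so the computation is numerically stable.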

$ \sum_{k=1}^{\infty} \frac{(n-k)^k}{k!} $ diverges. (It makes no sense for my purpose anyway: once $n-k$ goes negative, the odd exponents flip its sign.)
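The observed growth rate can be probed numerically (a quick sketch, pure standard library; the cutoffs 20, 50, 100 are illustrative choices):

```python
import math

def S(n):
    """The sum from the question: sum_{k=1}^{n} (n-k)^k / k!."""
    return sum((n - k)**k / math.factorial(k) for k in range(1, n + 1))

# Empirical exponential rate log(S_n)/n for a few n.
rates = {n: math.log(S(n)) / n for n in (20, 50, 100)}
# The rates creep upward past 0.5, toward the Omega constant 0.5671...
```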

Thanks!

3 Answers

Best answer:

We can obtain asymptotic approximations of this sum for large $n$.

First approach. We approximate the sum by an integral and then approximate the gamma function by Stirling's formula. Thus, \begin{align*} \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} &\sim \int_0^n {\frac{{(n - t)^t }}{{\Gamma (t + 1)}}\mathrm{d}t} = n\int_0^1 {n^{ns} \frac{{(1 - s)^{ns} }}{{\Gamma (ns + 1)}}\mathrm{d}s} \\ & \sim \sqrt {\frac{n}{{2\pi }}} \int_0^1 {s^{ - 1/2} \exp \left( { - n\left( {s\log \left( {\frac{s}{{1 - s}}} \right) - s} \right)} \right)\mathrm{d}s} \end{align*} as $n\to +\infty$.

The integral may be evaluated asymptotically via the saddle point method: \begin{align*} \int_0^1 {s^{ - 1/2} \exp \left( { - n\left( {s\log \left( {\frac{s}{{1 - s}}} \right) - s} \right)} \right)\mathrm{d}s} & \sim (1 - s_0 )\sqrt {\frac{{2\pi }}{n}} \mathrm{e}^{ - n\left( {s_0 \log \left( {\frac{{s_0 }}{{1 - s_0 }}} \right) - s_0 } \right)} \\ & = \frac{1}{{1 + \Omega }}\sqrt {\frac{{2\pi }}{n}} \mathrm{e}^{\Omega n} \end{align*} as $n\to +\infty$. Here $s_0 = \frac{\Omega }{{1 + \Omega }} = 0.361896 \ldots$ is the sole saddle point on the path of integration, and $\Omega=0.5671432\ldots$ satisfies $\Omega \mathrm{e}^\Omega =1$.

Therefore $$ \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} \sim \frac{\mathrm{e}^{\Omega n} }{{1 + \Omega }} $$ as $n\to +\infty$. This approximation is extremely good: with $n=10$, for example, the sum is $185.3375027557\ldots$, whereas the approximation yields $185.3375027898\ldots$. The second approach below shows the reason behind this accuracy.
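The leading term $\mathrm{e}^{\Omega n}/(1+\Omega)$ can be checked numerically with a short sketch (pure standard library; computing $\Omega$ by Newton's method is my choice, not part of the answer):

```python
import math

def omega():
    """Omega constant: the root of w*exp(w) = 1, via Newton's method."""
    w = 0.5
    for _ in range(50):
        w -= (w * math.exp(w) - 1.0) / (math.exp(w) * (w + 1.0))
    return w

def S0(n):
    """sum_{k=0}^{n} (n-k)^k / k!  (index starting at 0, as in this answer)."""
    return sum((n - k)**k / math.factorial(k) for k in range(n + 1))

W = omega()
n = 10
approx = math.exp(W * n) / (1 + W)  # leading-order saddle-point result
exact = S0(n)                       # 185.3375027557... per the answer
```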

Second approach. A different approach using generating functions yields a complete asymptotic expansion. We start with the following power series involving the principal branch of the Lambert $W$-function: $$ \exp (nW_0(z)) = \sum\limits_{j = 0}^\infty {\frac{{n(n - j)^{j - 1} }}{{j!}}z^j } , $$ valid for $|z|<\frac{1}{\mathrm{e}}$. Differentiating both sides gives $$ \frac{{W_0(z)}}{{1 + W_0(z)}}\exp (nW_0(z)) = \sum\limits_{j = 0}^\infty {\frac{{j(n - j)^{j - 1} }}{{j!}}z^j } . $$ Taking the difference of the two expansions, we find $$ \frac{{\exp (nW_0(z))}}{{1 + W_0(z)}} = \sum\limits_{j = 0}^\infty {\frac{{(n - j)^j }}{{j!}}z^j } , $$ and hence \begin{align*} \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} & = [z^n ]\frac{{\exp (nW_0(z))}}{{(1 + W_0(z))(1 - z)}} = \frac{1}{{2\pi \mathrm{i}}}\oint_{(0 + )} {\frac{{\exp (nW_0 (z))\mathrm{d}z}}{{(1 + W_0 (z))(1 - z)z^{n + 1} }}} \\ & = \frac{1}{{2\pi \mathrm{i}}}\oint_{(0 + )} {\frac{{\mathrm{d}z}}{{W_0^n (z)(1 + W_0 (z))z(1 - z)}}} = \frac{1}{{2\pi \mathrm{i}}}\oint_{(0 + )} {\frac{{\mathrm{d}t}}{{t^{n + 1} (1 - t\mathrm{e}^t )}}} . \end{align*}

Accordingly, $$ \frac{1}{{1 - z\mathrm{e}^z }} = \sum\limits_{n = 0}^\infty {\left( {\sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} } \right)z^n } , $$ provided $|z|<\Omega$. If $W_k$ denotes the $k$th branch of the Lambert $W$-function, then $$ \frac{1}{{1 - z\mathrm{e}^z }} - \sum\limits_{k = - \infty }^{ \infty } {\frac{1}{{1 + W_k (1)}}\frac{1}{{1 - z\mathrm{e}^{W_k (1)} }}} $$ is an entire function of $z$ (note that $W_{ \pm k} (1) = - \log (2\pi k) \pm (2k - \frac{1}{2})\pi \mathrm{i} + o(1)$ for $k\ge 1$). Thus $$ \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} \sim \sum\limits_{k = - \infty }^{ \infty } {\frac{{\mathrm{e}^{W_k (1)n} }}{{1 + W_k (1)}}} = \frac{{\mathrm{e}^{\Omega n} }}{{1 + \Omega }} + 2 \sum\limits_{k = 1}^{ \infty } \operatorname{Re}\frac{\mathrm{e}^{W_k (1)n} }{1 + W_k (1)} , $$ as $n\to +\infty$.
In particular, $$ \sum\limits_{k = 0}^n {\frac{{(n - k)^k }}{{k!}}} = \frac{{\mathrm{e}^{\Omega n} }}{{1 + \Omega }} + \mathcal{O}(\mathrm{e}^{n\operatorname{Re} W_1 (1)} ) = \frac{{\mathrm{e}^{\Omega n} }}{{1 + \Omega }} + \mathcal{O}(\mathrm{e}^{ - 1.5339133 \ldots \times n} ), $$ which shows why the leading-order asymptotic approximation is so accurate.
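The generating-function identity $\frac{1}{1-z\mathrm{e}^z}=\sum_n\big(\sum_{k=0}^n\frac{(n-k)^k}{k!}\big)z^n$ can itself be verified by elementary power-series arithmetic (a sketch; the truncation order $N=12$ is an illustrative choice): expand $1/(1-f(z))$ with $f(z)=z\mathrm{e}^z$ via the recursion $g = 1 + fg$ and compare coefficients.

```python
import math

N = 12  # truncation order for the power-series check

# Taylor coefficients of f(z) = z*e^z: the coefficient of z^j is 1/(j-1)!.
f = [0.0] * (N + 1)
for j in range(1, N + 1):
    f[j] = 1.0 / math.factorial(j - 1)

# Coefficients g of 1/(1 - f(z)), solved term by term from g = 1 + f*g.
g = [0.0] * (N + 1)
g[0] = 1.0
for m in range(1, N + 1):
    g[m] = sum(f[j] * g[m - j] for j in range(1, m + 1))

# The claimed coefficients: partial sums sum_{k=0}^{n} (n-k)^k / k!.
sums = [sum((n - k)**k / math.factorial(k) for k in range(n + 1))
        for n in range(N + 1)]
```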

Answer:

$$S_n=\sum_{k=1}^{n} \frac{(n-k)^k}{k!}$$ made me think of the Lambert function.

A quick-and-dirty linear regression $$\log(S_n)=a+b\,n$$ fits with high accuracy and gives $b=W(1)=\Omega$.

Looking at $\log(S_n)-\Omega\,n$ shows an asymptotic value $a=-0.44925440\cdots$, which is not identified by inverse symbolic calculators.

However, $e^a=0.638103743365111$ is extremely close to the largest root of the quadratic $9280 x^2 - 6351 x + 274=0$: $$\frac{6351+\sqrt{30164321}}{18560}=0.638103743365052$$

Computing a few values of $$T_n=\frac{6351+\sqrt{30164321}}{18560}\,e^{\Omega\,n}$$ and comparing with $S_n$:

$$\left( \begin{array}{ccc} n & T_n & S_n \\ 10 & 1.85338\times 10^{02} & 1.84338 \times 10^{02} \\ 20 & 5.38314\times 10^{04} & 5.38304 \times 10^{04}\\ 30 & 1.56353\times 10^{07} & 1.56353\times 10^{07} \\ 40 & 4.54129\times 10^{09} & 4.54129\times 10^{09} \\ 50 & 1.31902\times 10^{12} & 1.31902\times 10^{12} \\ 60 & 3.83110\times 10^{14} & 3.83110\times 10^{14} \\ 70 & 1.11274\times 10^{17} & 1.11274\times 10^{17} \\ 80 & 3.23197\times 10^{19} & 3.23197\times 10^{19} \\ 90 & 9.38727\times 10^{21} & 9.38727\times 10^{21} \\ 100 & 2.72654\times 10^{24} & 2.72654\times 10^{24} \\ 200 & 1.16502\times 10^{49} & 1.16502\times 10^{49} \\ 300 & 4.97796\times 10^{73} & 4.97796\times 10^{73} \\ 400 & 2.12702\times 10^{98} & 2.12702\times 10^{98} \end{array} \right)$$
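A short sketch reproducing, say, the $n=30$ row (pure standard library; $\Omega$ via Newton's method is my choice). Note that $S_n$ here starts at $k=1$ while the asymptotic targets the sum from $k=0$, so $T_n-S_n\approx 1$: that accounts for the visible gap in the $n=10$ row and is invisible at six significant figures for larger $n$.

```python
import math

# Newton's method for the Omega constant (root of w*exp(w) = 1).
w = 0.5
for _ in range(50):
    w -= (w * math.exp(w) - 1.0) / (math.exp(w) * (w + 1.0))

c = (6351 + math.sqrt(30164321)) / 18560   # the quadratic-root prefactor
n = 30
T = c * math.exp(w * n)
S = sum((n - k)**k / math.factorial(k) for k in range(1, n + 1))
# Both round to 1.56353e7, matching the n = 30 row of the table.
```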

Answer:

By Lagrange's inversion theorem, $$ e^{-r W_0(x)} = \sum_{n\geq 0}\frac{r(n+r)^{n-1}}{n!}(-x)^n $$ holds for any $r\in\mathbb{C}$ and $|x|<\frac{1}{e}$. Replacing $x$ with $z e^z$ gives $$ e^{-r z} = \sum_{k\geq 0}\frac{r (k+r)^{k-1}}{k!}(-ze^z)^k $$ and picking $r=-n$ yields $$ \exp(n z) = \sum_{k\geq 0}\frac{n(n-k)^{k-1}}{k!}(z e^z)^k, $$ which allows one to deduce Gary's bound from the dominated convergence theorem (the tail $\sum_{k>n}$ decays quickly, helped by the alternating signs).
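The final expansion is easy to check numerically (a sketch, pure standard library; the values $n=5$, $z=0.2$ and the truncation at $k<200$ are illustrative choices, not from the answer):

```python
import math

# Check  exp(n*z) = sum_{k>=0} n*(n-k)^(k-1)/k! * (z*e^z)^k,
# valid for |z*e^z| < 1/e.
n, z = 5, 0.2
x = z * math.exp(z)                       # |x| = 0.244... < 1/e = 0.367...
series = sum(n * (n - k)**(k - 1) / math.factorial(k) * x**k
             for k in range(200))         # tail terms decay geometrically
target = math.exp(n * z)                  # the closed form on the left
```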