Evaluating a binomial sum involving $1/m^m$


I was wondering whether there is a closed form (or asymptotic expression) for the following binomial sum:

$$\sum_{m=0}^n \frac{1}{m^m} \binom{n}{m},$$

where we use the convention $0^0=1$. I feel like during my university studies we were taught a trick to solve problems like this by playing around with the binomial theorem, exponentials and sometimes calculus. However, nothing I've tried so far seems to work.

At the very least, a non-trivial upper bound would be handy.

Edit: Thanks, everyone, for the answers; they're very helpful, and it's fun to see all the different approaches.

There are 4 answers below.

Best answer

I'm not aware of any technique for working with expressions like $\frac{1}{m^m}$ in sums exactly. If you had written $\frac{1}{n^m}$ then we'd just get $\left( 1 + \frac{1}{n} \right)^n$ by the binomial theorem, and if you had written $\frac{1}{m^n}$ maybe some other tricks would be available.
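For contrast, the $\frac{1}{n^m}$ variant really does collapse via the binomial theorem; a quick illustrative check (my own code):

```python
import math

n = 20
# with 1/n^m in place of 1/m^m, the binomial theorem gives (1 + 1/n)^n
lhs = sum(math.comb(n, m) / n ** m for m in range(n + 1))
assert abs(lhs - (1 + 1 / n) ** n) < 1e-9
```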

Asymptotics are easier. Generally speaking we can bound this sum from below by its largest term and above by $n$ times its largest term. Using the upper bound ${n \choose m} \le \frac{n^m}{m!}$, then using Stirling's approximation in the form $m! \ge \left( \frac{m}{e} \right)^m$, we get the very useful bound

$${n \choose m} \le \left( \frac{en}{m} \right)^m$$

which is close to tight when $m$ is small compared to $n$, and which gives

$$\frac{1}{m^m} {n \choose m} \le \left( \frac{en}{m^2} \right)^m.$$

This expression has logarithm $m \log \frac{en}{m^2} = m \log n + m - 2m \log m$, which has derivative $\log n + 1 - 2 \log m - 2 = \log n - 1 - 2 \log m$ as a function of $m$, and hence is maximized exactly when $m = \sqrt{ \frac{n}{e} }$; this is also the approximate location of the true largest term. The value of $\left( \frac{en}{m^2} \right)^m$ at $m = \sqrt{ \frac{n}{e} }$ is therefore an upper bound on the largest term and hence on every term, so substituting (there $\frac{en}{m^2} = e^2$) we get

$$\boxed{ \sum_{m=0}^n \frac{1}{m^m} {n \choose m} \le n \exp \left( 2 \sqrt{ \frac{n}{e} } \right) }.$$

We have $\frac{2}{\sqrt{e}} \approx 1.213 \dots $ so this is in good agreement with Claude's regression. We should also be able to get a lower bound that looks something like $\exp \left( 2 \sqrt{ \frac{n}{e} } \right)$ by lower bounding the largest term but the details seem slightly annoying. We can also improve the factor of $n$ by noting that for $m \ge \sqrt{en}$ we have $\left( \frac{ne}{m^2} \right)^m \le 1$, which improves the bound to

$$\sum_{m=0}^n \frac{1}{m^m} {n \choose m} \le \sqrt{en} \exp \left( 2 \sqrt{ \frac{n}{e} } \right) + n.$$

Working a bit more carefully we get a bound of $(1 + o(1)) \sqrt{ \frac{n}{e} } \exp \left( 2 \sqrt{ \frac{n}{e}} \right)$ and to do better than that we'd have to bound the $m < \sqrt{ \frac{n}{e} }$ terms more carefully.
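As a numerical sanity check (my own code, not part of the argument), the sum can be evaluated in log-space and compared against both the boxed bound $n \exp(2\sqrt{n/e})$ and the improved one:

```python
import math

def log_sum(n):
    """log of sum_{m=0}^n C(n,m)/m^m, summed in log-space for stability (0^0 = 1)."""
    logs = [0.0]  # m = 0 term
    for m in range(1, n + 1):
        log_binom = (math.lgamma(n + 1) - math.lgamma(m + 1)
                     - math.lgamma(n - m + 1))
        logs.append(log_binom - m * math.log(m))
    mx = max(logs)
    return mx + math.log(sum(math.exp(v - mx) for v in logs))

for n in (10, 100, 1000):
    # boxed bound: n * exp(2*sqrt(n/e))
    assert log_sum(n) <= math.log(n) + 2 * math.sqrt(n / math.e)
    # improved bound: sqrt(e*n) * exp(2*sqrt(n/e)) + n
    assert log_sum(n) <= math.log(math.sqrt(math.e * n)
                                  * math.exp(2 * math.sqrt(n / math.e)) + n)
```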

Another answer

Let $$S_n=\sum_{m=0}^n \frac{1}{m^m} \binom{n}{m}=\frac {a_n}{b_n},$$ where the numerators $a_n$ form the apparently uncatalogued sequence $$\{1,2,13,517,45979,192028787,6736704119,7055409566468597,578095098313662358187\}$$ and the denominators $b_n$ correspond to sequence A107048 in the OEIS.
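The exact values are easy to reproduce with rational arithmetic; a minimal sketch (the function name `S` is mine):

```python
from fractions import Fraction
from math import comb

def S(n):
    """Exact S_n = sum_{m=0}^n C(n,m)/m^m as a Fraction, with 0^0 = 1."""
    total = Fraction(1)  # m = 0 term
    for m in range(1, n + 1):
        total += Fraction(comb(n, m), m ** m)
    return total

numerators = [S(n).numerator for n in range(9)]
print(numerators)  # starts 1, 2, 13, 517, 45979, ...
```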

I do not think that a closed form exists.

For $n \leq 500$, a quick-and-dirty regression of the form $$\log(S_n)=-a+b\,\sqrt n$$ gives (with $R^2=0.9999957$)

$$\begin{array}{llll} \text{} & \text{Estimate} & \text{Std Error} & \text{Confidence Interval} \\ a & 0.6706424 & 0.0052274 & \{0.6603718,0.6809129\} \\ b & 1.2193423 & 0.0003200 & \{1.2186940,1.2199906\} \\ \end{array}$$

Thanks to the ISC (Inverse Symbolic Calculator), to make things nicer looking, use $$a \sim \frac{13\cdot 10^{3/4}-11^{3/4}}{100} \qquad \text{and} \qquad b\sim\left(\frac{23}{10}\right)^{5/21}.$$
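The fit is straightforward to reproduce with an ordinary least-squares line through the points $(\sqrt n, \log S_n)$; the exact estimates vary slightly with the range of $n$ used (code and names are mine):

```python
import math

def log_S(n):
    """log of S_n = sum_{m=0}^n C(n,m)/m^m, summed in log-space (0^0 = 1)."""
    logs = [0.0]
    for m in range(1, n + 1):
        lb = math.lgamma(n + 1) - math.lgamma(m + 1) - math.lgamma(n - m + 1)
        logs.append(lb - m * math.log(m))
    mx = max(logs)
    return mx + math.log(sum(math.exp(v - mx) for v in logs))

# ordinary least squares for log(S_n) = -a + b*sqrt(n), n = 1..500
xs = [math.sqrt(n) for n in range(1, 501)]
ys = [log_S(n) for n in range(1, 501)]
N = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
b = (N * sxy - sx * sy) / (N * sxx - sx * sx)
a = -(sy - b * sx) / N
print(a, b)  # should land near the tabulated estimates above
```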

Another answer

Another approach to large $n$ asymptotics.

Use the identity $$ m^{-m}=\frac 1{(m-1)!}\int_0^1(-\phi(x))^{m-1}\,dx,\qquad \phi(x):=x\log x $$ (substitute $x=e^{-t}$ to reduce it to $\int_0^\infty t^{m-1}e^{-mt}\,dt=(m-1)!/m^m$) to rewrite $$ a(n):=\sum_{m=1}^n\binom nm m^{-m} = \int_0^1\sum_{m=1}^n\binom nm \frac{(-\phi(x))^{m-1}}{(m-1)!}\,dx. $$

It turns out that $$ \sum_{m=1}^n\binom nm \frac{(-t)^{m-1}}{(m-1)!}=L^{(1)}_{n-1}(t) $$ is a generalized Laguerre polynomial. For large $n$ we have the asymptotics (see, e.g., the Wikipedia page on Laguerre polynomials) $$ L^{(1)}_{n-1}(-t)\sim\frac{n^{1/4}}{2\sqrt\pi\,t^{3/4}}\exp\left(-\frac t2+2\sqrt{tn}\right),\quad n\to+\infty,\ t>0. $$

Plugging this into the expression for $a(n)$ gives $$ a(n)\sim \frac{n^{1/4}}{2\sqrt\pi}\int_0^1\frac{\exp(\phi(x)/2)}{(-\phi(x))^{3/4}}\exp\left(2\sqrt{-n\phi(x)}\right)dx,\quad n\to+\infty. $$

Finally, this integral is amenable to Laplace's method; the exponent $\sqrt{-\phi(x)}$ is maximized at $x=1/e$, yielding $$ a(n)\sim\frac{n^{1/4}}{2\sqrt\pi}\,\frac{\exp(\phi(1/e)/2)}{(-\phi(1/e))^{3/4}}\,e^{2\sqrt{-n\phi(1/e)}}\sqrt{\frac{2\pi}{2\sqrt n \left|\frac{d^2}{dx^2}\sqrt{-\phi(x)}\,\Big|_{x=1/e}\right|}},\quad n\to+\infty, $$ and so, after some algebra, $$ \boxed{a(n)\sim\frac 1{\sqrt 2}\,e^{2\sqrt {\frac ne}-\frac{1}{2 e}},\quad n\to+\infty. } $$
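The polynomial identity above can be double-checked in exact arithmetic against the standard three-term recurrence for generalized Laguerre polynomials (a small sketch; the function names are mine):

```python
from fractions import Fraction
from math import comb

def laguerre(k, alpha, x):
    """L_k^{(alpha)}(x) via the standard three-term recurrence, in exact arithmetic."""
    if k == 0:
        return Fraction(1)
    prev, cur = Fraction(1), Fraction(1 + alpha) - x
    for j in range(1, k):
        prev, cur = cur, ((2 * j + 1 + alpha - x) * cur - (j + alpha) * prev) / (j + 1)
    return cur

def binom_sum(n, t):
    """sum_{m=1}^n C(n,m) (-t)^(m-1) / (m-1)!"""
    total, fact = Fraction(0), 1
    for m in range(1, n + 1):
        total += Fraction(comb(n, m)) * (-t) ** (m - 1) / fact  # fact = (m-1)!
        fact *= m
    return total

t = Fraction(3, 7)  # arbitrary rational test point
for n in range(1, 12):
    assert binom_sum(n, t) == laguerre(n - 1, 1, t)
```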

(There are a few details missing to get a complete proof, but the final asymptotics seems to be numerically correct.)

Another answer

I am going to derive a sharp upper bound. I shall use the inequality $$ \Gamma (m + 1/2) < m^m \mathrm{e}^{ - m} \sqrt {2\pi }, $$ valid for all $m\geq 0$ (follows from, e.g., this paper). Using this inequality and the Legendre duplication formula in the form $m!\,\Gamma(m+1/2)=\sqrt{\pi}\,(2m)!/4^m$, we find \begin{align*} \sum\limits_{m = 0}^n {\frac{1}{{m^m }}\binom{n}{m}} & \le \sum\limits_{m = 0}^n {\frac{1}{{m^m }}\frac{{n^m }}{{m!}}} = \sqrt {2\pi } \sum\limits_{m = 0}^n {\frac{1}{{m^m \mathrm{e}^{ - m} \sqrt {2\pi } }}\frac{{(n/\mathrm{e})^m }}{{m!}}} \\ & \le \sqrt {2\pi } \sum\limits_{m = 0}^n {\frac{{(n/\mathrm{e})^m }}{{m!\Gamma (m + 1/2)}}} \le \sqrt {2\pi } \sum\limits_{m = 0}^\infty {\frac{{(n/\mathrm{e})^m }}{{m!\Gamma (m + 1/2)}}} \\ & = \sqrt 2 \sum\limits_{m = 0}^\infty {\frac{{(2\sqrt {n/\mathrm{e}} )^{2m} }}{{(2m)!}}} = \sqrt 2 \cosh (2\sqrt {n/\mathrm{e}} ). \end{align*} Thus, $$\boxed{ \sum\limits_{m = 0}^n {\frac{1}{{m^m }}\binom{n}{m}} \le \sqrt 2 \cosh (2\sqrt {n/\mathrm{e}} ) ,\quad n\geq 0.} $$
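Both the gamma inequality and the boxed bound are easy to spot-check numerically (my own code, illustrative only):

```python
import math

def S_float(n):
    """sum_{m=0}^n C(n,m)/m^m with 0^0 = 1 (floats, fine for moderate n)."""
    total = 1.0  # m = 0 term
    for m in range(1, n + 1):
        total += math.exp(math.lgamma(n + 1) - math.lgamma(m + 1)
                          - math.lgamma(n - m + 1) - m * math.log(m))
    return total

# Gamma(m + 1/2) < m^m e^{-m} sqrt(2*pi), checked in log form
for m in range(1, 100):
    assert math.lgamma(m + 0.5) < m * math.log(m) - m + 0.5 * math.log(2 * math.pi)

# the boxed bound sqrt(2) * cosh(2*sqrt(n/e))
for n in range(301):
    assert S_float(n) <= math.sqrt(2) * math.cosh(2 * math.sqrt(n / math.e))
```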

Note that this upper bound is asymptotic to $\frac{1}{{\sqrt 2 }}\exp (2\sqrt {n/\mathrm{e}} )$ and thus, by the result of Giulio, is asymptotically off by a factor of $\mathrm{e}^{1/(2\mathrm{e})} = 1.201943368 \ldots\,$.