Intuitive explanation for $\lim\limits_{n\to\infty}\left(\frac{\left(n!\right)^{2}}{\left(n-x\right)!\left(n+x\right)!}\right)^{n}=e^{-x^2}$


In this post I noticed (at first numerically) that:

$$\lim\limits_{n\to\infty}\left(\frac{\left(n!\right)^{2}}{\left(n-x\right)!\left(n+x\right)!}\right)^{n}=e^{-x^2}$$
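The numerical observation is easy to reproduce (a quick sketch; the helper name `ratio_pow_n` and the sample values are my choices, using `math.lgamma` so the factorials never overflow):

```python
import math

def ratio_pow_n(n, x):
    """Compute ((n!)^2 / ((n-x)! (n+x)!))^n via log-gamma, avoiding huge factorials."""
    log_ratio = 2 * math.lgamma(n + 1) - math.lgamma(n - x + 1) - math.lgamma(n + x + 1)
    return math.exp(n * log_ratio)

x = 2
for n in (10, 100, 1000, 10000):
    print(n, ratio_pow_n(n, x))   # approaches exp(-4)
print(math.exp(-x * x))
```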

This can be proved by looking at the Taylor expansion

$$n\ln\left(\frac{\left(n!\right)^{2}}{\left(n-x\right)!\left(n+x\right)!}\right)=-2n\sum_{k=1}^{\infty}\frac{\psi^{(2k-1)}\left(n+1\right)}{\left(2k\right)!}x^{2k}$$

and the asymptotic expansion

$$\psi^{(m)}(n+1)=\left(-1\right)^{\left(m+1\right)}\sum_{k=0}^{\infty}\frac{\left(k+m-1\right)!}{k!}\frac{B_k}{n^{k+m}}$$

where we have chosen $B_1=-\frac12$.
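As a sanity check on this expansion (a sketch under the stated $B_1=-\frac12$ convention; the truncation point and integral tail estimate are my choices), one can compare $\psi^{(1)}(n+1)=\sum_{j\ge n+1} j^{-2}$ against its first three asymptotic terms $\frac1n-\frac1{2n^2}+\frac1{6n^3}$:

```python
def trigamma(z, terms=10**6):
    # psi^(1)(z) = sum_{j>=0} 1/(z+j)^2; truncate at `terms` and estimate the
    # tail by the integral of 1/t^2, i.e. sum_{j>=terms} 1/(z+j)^2 ~ 1/(z+terms).
    return sum(1.0 / (z + j) ** 2 for j in range(terms)) + 1.0 / (z + terms)

n = 50
series = 1/n - 1/(2*n**2) + 1/(6*n**3)   # terms from B_0=1, B_1=-1/2, B_2=1/6
print(trigamma(n + 1), series)            # agree to better than 1e-8
```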

Still, it is striking that this limit produces the Gaussian function. It makes me wonder whether there is a more intuitive way to understand it, possibly in a probabilistic context.

There are 4 answers below.

Accepted answer:

Consider first

$$ \begin{align} \frac{(n+5)!}{n!} &=(n+5)(n+4)\cdots (n+1)\\ &= n^5 (1+5/n)(1+4/n) \cdots (1+1/n) \\ & \approx n^5 \left(1 + \frac{5+4+\cdots+1}{n}\right) \tag1\\ & = n^5 \left(1 + \frac{5\times6}{2n}\right) \\ \end{align}$$

Or, in general

$$ \frac{(n+x)!}{n!} \approx n^x \left(1 + \frac{x(x+1)}{2n}\right) \tag2$$

Similarly: $$ \frac{n!}{(n-x)!} \approx n^x \left(1 - \frac{x(x-1)}{2n}\right) \tag3$$

Then

$$ \begin{align} \frac{\left(n!\right)^{2}}{\left(n-x\right)!\left(n+x\right)!} &\approx \frac{\left(1 - \frac{x(x-1)}{2n}\right)}{\left(1 + \frac{x(x+1)}{2n}\right)}\\ &\approx \left(1 - \frac{x(x-1)}{2n}\right) \left(1 - \frac{x(x+1)}{2n}\right)\\ &\approx \left(1 - \frac{x^2}{n}\right) \tag4 \end{align} $$

And, using the standard exponential limit:

$$\lim_{n\to \infty} \left(1 - \frac{x^2}{n}\right)^n =e^{-x^2} \tag5$$
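Approximations $(2)$–$(4)$ can be checked with exact rational arithmetic (a minimal sketch; the helper name `exact_ratio` and the parameter values are arbitrary choices of mine):

```python
from fractions import Fraction
from math import factorial

def exact_ratio(n, x):
    # (n!)^2 / ((n-x)! (n+x)!) as an exact rational number
    return Fraction(factorial(n) ** 2, factorial(n - x) * factorial(n + x))

n, x = 1000, 3
print(float(exact_ratio(n, x)), 1 - x**2 / n)   # differ only at order 1/n^2
```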

Edit: Result $(4)$ can also be obtained by probabilistic reasoning. Imagine the following scenario: from a bag with $n$ white balls and $x$ black balls we pick $x$ balls. What is the probability that all the picked balls are white?

This is $$ \frac{\binom{n}{x}}{\binom{n+x}{x}}=\frac{\left(n!\right)^{2}}{\left(n-x\right)!\left(n+x\right)!}$$

precisely the LHS in eq $(4)$.

Now, if $n \gg x$, the probability that more than one of the picked balls is black is negligible; hence we can approximate this by $P \approx 1 - x \cdot \frac{x}{n} = 1 - x^2/n$.
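The urn picture is easy to simulate (a minimal Monte Carlo sketch; the ball labelling, seed, and trial count are my choices):

```python
import random
from math import comb

random.seed(0)
n, x, trials = 500, 3, 200_000
# balls 0..n-1 are white, balls n..n+x-1 are black; draw x without replacement
hits = sum(
    all(ball < n for ball in random.sample(range(n + x), x))
    for _ in range(trials)
)
print(hits / trials)                 # simulated P(all picked balls white)
print(comb(n, x) / comb(n + x, x))   # exact hypergeometric probability
print(1 - x**2 / n)                  # the approximation above
```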

BTW: the complete term in the original limit, $\left(\frac{\left(n!\right)^{2}}{\left(n-x\right)!\left(n+x\right)!}\right)^{n}$, can then be regarded as the probability of getting no black balls (after adding $n$ white balls) in $n$ independent tries. I don't see how this probability could be associated with a Gaussian density.

Another answer:

Here is my approach. This may not be the simplest route, since it goes through the Weierstrass product for the Gamma function. We have

$$\Gamma (z+1) = e^{-\gamma z} \prod_{k=1}^{\infty} \left(1+\frac{z}{k}\right)^{-1} e^{z/k}\;\Rightarrow\; \log \Gamma (z+1)=- \gamma z + \sum_{k=1}^{\infty}\left[\frac{z}{k}- \ln \left(1+\frac{z}{k}\right)\right]$$

Now let $a_n:= n\left[2\log \Gamma (n+1) - \log \Gamma (n+1-x) - \log \Gamma (n+1+x)\right]$. We need to prove that $\lim_{n \to \infty} e^{a_n}= e^{-x^2}$, i.e. that $\lim_{n \to \infty} a_n=-x^2$. Doing some algebra we obtain:

$$a_n = n \sum_{k=1}^{\infty}\left[\ln\left(1 + \frac{n+x}{k}\right)+\ln\left(1 + \frac{n-x}{k}\right) - 2\ln\left(1 + \frac{n}{k}\right)\right]=n \sum_{k=1}^{\infty} \ln\left(1- \frac{x^2}{(n+k)^2}\right)=-~x^2 \sum_{k=1}^{\infty} \frac{n}{(n+k)^2}+\mathcal{O}(x^4)\sum_{k=1}^{\infty} \frac{n}{(n+k)^4}$$

It follows from the comparison test that $\sum_{k=1}^{\infty} \frac{n}{(n+k)^4}\rightarrow 0$ as $n \to \infty$. For the first sum, interpreting it as a Riemann sum gives:

$$\lim_{n \to \infty}\sum_{k=1}^{\infty} \frac{n}{(n+k)^2} = \lim_{n \to \infty}\frac{1}{n} \sum_{k=1}^{\infty} \frac{1}{(\frac{k}{n}+1)^2} = \int_{0}^{\infty} \frac{1}{(t+1)^2}\,\mathrm{d}t = 1$$

This yields $\lim_{n \to \infty} a_n = - x^2$. $\blacksquare$
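The Riemann-sum step can be checked numerically (a sketch; truncating the infinite sum at $K$ terms leaves a tail of roughly $n/K$, so $K$ must grow well beyond $n$):

```python
def partial_sum(n, K=10**6):
    # sum_{k=1}^{K} n/(n+k)^2; the neglected tail is about n/K
    return sum(n / (n + k) ** 2 for k in range(1, K + 1))

for n in (10, 100, 1000):
    print(n, partial_sum(n))   # tends to 1 as n grows
```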

Another answer:

Not sure if I have committed some fallacy below...

Taylor's theorem: $\log(1-z) = -(1-h(z)) z$ where $\lim_{z \to 0} h(z) =0$.

\begin{align} \lim_{n \to \infty} n \log \frac{(n!)^2}{(n-x)!(n+x)!} &= \lim_{n \to \infty} n \sum_{k=1}^{x} \log \frac{n-x+k}{n+k} \\ &= \lim_{n \to \infty}n \sum_{k=1}^x \log\left(1-\frac{x}{n+k}\right) \\ &= - \lim_{n \to \infty} n \sum_{k=1}^x \left(1 - h\!\left(\frac{x}{n+k}\right)\right)\frac{x}{n+k} \\ &= - x \sum_{k=1}^x \lim_{n \to \infty}\frac{n}{n+k} \\ &= -x^2 \end{align}
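The first equality above rests on the telescoping identity $\frac{(n!)^2}{(n-x)!(n+x)!}=\prod_{k=1}^{x}\frac{n-x+k}{n+k}$, which can be confirmed exactly (a sketch with arbitrary small values of mine):

```python
from fractions import Fraction
from math import factorial

n, x = 20, 4
lhs = Fraction(factorial(n) ** 2, factorial(n - x) * factorial(n + x))
rhs = Fraction(1)
for k in range(1, x + 1):
    rhs *= Fraction(n - x + k, n + k)
print(lhs == rhs)  # True
```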

Edit:

If I haven't made some error, this seems to be the simplest approach that avoids hand-wavy approximations (the only approximation comes from the application of Taylor's theorem). No need for Stirling's approximation or gamma functions. However, as Mourad points out, this proof relies on $x$ being an integer (I assumed this since OP stated the question using factorials instead of the gamma function, and because OP was looking for a probabilistic/combinatorial interpretation), so the general setting would require some approximation of the gamma function, in which case the other answers would be better-suited.

Regarding a probabilistic explanation, I think leonbloy's edit has the simplest interpretation, although (as with the main part of his answer) I think some care must be taken to ensure that the approximations to $\frac{(n!)^2}{(n-x)!(n+x)!}$ do not blow up when raising everything to the $n$th power. The advantage of taking the limit of the logarithm as I have done here is seeing exactly how fine an approximation I need on the main term $\log \frac{(n!)^2}{(n-x)!(n+x)!}$ in order to not blow up when the factor of $n$ enters from the outside.

Another answer:

This is not meant to be a fully rigorous argument, but gives some idea of where the $e^{-x^2}$ is coming from:

Start with $2n^2$ coins, and group them into $n$ groups of $2n$. Flip them all. The probability that, within each group, you get exactly $n$ heads, is $$2^{-2n^2} \left( \frac{(2n)!}{n! \ n!} \right)^n.$$ The probability that, in each group, you get precisely $n+x$ heads, is $$2^{-2n^2} \left( \frac{(2n)!}{(n+x)!(n-x)!} \right)^n.$$ The ratio of the two is $$\left( \frac{n! \ n!}{(n+x)! (n-x)!} \right)^n.$$

What I want to do is use the central limit theorem. The trouble is that the versions of the CLT that I know work in a fixed number of dimensions, not a growing number. But here is a rough approach which I hope makes the result plausible.

Let's collapse the boundaries between the groups and just ask about getting $n^2$ total heads, versus $n(n+x)$ total heads. The expected number of heads is $n^2$ and the standard deviation is $\sqrt{2n^2}/2 = n/\sqrt{2}$, so $n(n+x)$ heads is $\sqrt{2} x$ standard deviations away from the mean. By the central limit theorem, we expect the probability of $n(n+x)$ heads to be a factor of $e^{-(\sqrt{2} x)^2/2} = e^{-x^2}$ smaller than the probability of $n^2$ heads. This part is rigorous.
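This ratio of point probabilities can be computed directly from binomial coefficients (a sketch; `n` is kept small so `math.comb` stays fast, and exact big-integer division preserves precision):

```python
import math

n = 30
N = 2 * n * n                 # 2n^2 coins in total
base = math.comb(N, n * n)    # ways to get exactly n^2 heads
for x in (1, 2):
    ratio = math.comb(N, n * n + n * x) / base
    print(x, ratio, math.exp(-x * x))   # the two columns nearly match
```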

Here comes the fuzzy part. The most likely way to get $n^2$ heads is that each group gets $n$, and the most likely way to get $n(n+x)$ heads is that each group gets $n+x$. So it doesn't surprise me that the ratio is also $e^{-x^2}$ here.