Using the Central Limit Theorem to show $\lim_{n \to \infty} \frac{1}{(n-1)!} \int_0^n x^{n-1}e^{-x} dx= 1/2$


I want to use the Central Limit Theorem to show that the following limit holds: $$\lim_{n \to \infty} \frac{1}{(n-1)!} \int_0^n x^{n-1}e^{-x} dx= 1/2.$$

If I let $X_1, X_2, \ldots$ be an i.i.d. sequence of exponentially distributed random variables with rate $\lambda=1$, then I know that $\mathbb{E}(X_i)=1$ and $\mathbb{V}(X_i)=1$ for all $i \geq 1$. Let $S_n=X_1+\ldots+X_n$. In order to use the Central Limit Theorem, I need to standardize so that the mean is $0$ and the variance is $1$.

I don't see how exactly to get the integral on the left and I definitely don't see where 1/2 comes from. I am fairly new to probability, so I apologize if this question seems easy.


Best Answer

$\newcommand{\P}{\mathbb{P}}$I'll give you the idea of how to solve this. First, you should recognize the LHS as the CDF of the Erlang distribution. Let $X_i \sim \text{Exp}(1)$ i.i.d.; then $Y_n=\sum_{i=1}^n X_i \sim \text{Erlang}(n,1)$ (shape $n$, rate $1$). The CDF of $Y_n$ is \begin{align}\tag{1} F_{Y_n}(y)=\P(Y_n<y) = \int^y_0 \frac{t^{n-1}e^{-t}}{(n-1)! }\, dt. \end{align} On the other hand, by the CLT (check the conditions!) we have \begin{align}\tag{2} \lim_{n\to\infty}\P\left( \frac{\sum_{i=1}^nX_i - nE[X_1]}{\sqrt{n\,\text{Var}(X_1)}} <z \right)=\lim_{n\to\infty}\P\left( \frac{\sum_{i=1}^nX_i - n\cdot 1}{\sqrt{n\cdot 1}} <z \right)=\Phi(z), \end{align} where $\Phi(\cdot)$ is the CDF of the standard normal distribution. What is left to do is to rewrite (1) in such a way that you get something similar to (2). Can you finish? The full solution follows below for you to check.

Solution

\begin{align} \lim_{n\to\infty} \int^n_0 \frac{t^{n-1}e^{-t}}{(n-1)! }\, dt = \lim_{n\to\infty} \P\left( \sum_{i=1}^n X_i< n \right) = \lim_{n\to\infty} \P\left(\sum_{i=1}^nX_i -n <0\right) = \lim_{n\to\infty}\P\left( \frac{\sum_{i=1}^nX_i - n}{\sqrt[]{n}} <0 \right) = \Phi(0) \end{align} One knows that $\Phi(0)=\frac{1}{2}$, by looking it up or using the symmetry of the Gaussian pdf.
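As a numerical sanity check (not part of the original answer), note that for $Y_n \sim \text{Erlang}(n,1)$ the CDF satisfies $\P(Y_n \le y) = \P(N \ge n)$ for $N \sim \text{Poisson}(y)$, so the integral can be evaluated with only the standard library by summing Poisson terms in log-space. The function name below is my own:

```python
import math

def erlang_cdf_at_n(n):
    """P(Y_n <= n) for Y_n a sum of n i.i.d. Exp(1) variables,
    i.e. (1/(n-1)!) * integral_0^n x^(n-1) e^(-x) dx.

    Uses the identity P(Y_n <= y) = P(N >= n) for N ~ Poisson(y),
    summing the Poisson terms in log-space to avoid underflow."""
    lower_tail = sum(
        math.exp(k * math.log(n) - n - math.lgamma(k + 1))  # e^{-n} n^k / k!
        for k in range(n)
    )
    return 1.0 - lower_tail

# The values decrease toward Phi(0) = 1/2 as n grows:
for n in (1, 10, 100, 1000):
    print(n, round(erlang_cdf_at_n(n), 4))
```

For $n=1$ this gives $1-e^{-1}\approx 0.632$, and the values visibly drift down toward $1/2$, consistent with the limit derived above.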


You can use the following fact: given $X_1,\ldots,X_n$ independent random variables with $X_i \sim \text{Exp}(\lambda)$ for all $i=1, \ldots , n$, if $Z_n= X_1+\ldots+X_n$ then the density function of $Z_n$ is given by
$$f_{Z_n}(x)=\frac{\lambda^n x^{n-1}}{(n-1)!}e^{-\lambda x}\, I_{[0, \infty)}(x).$$
Thus
$$\frac{\lambda^n}{(n-1)!}\int_{0}^{\frac{n}{\lambda}}x^{n-1}e^{-\lambda x}dx=\int_{0}^{\frac{n}{\lambda}}f_{Z_n}(x)dx=\int_{-\infty}^{\frac{n}{\lambda}}f_{Z_n}(x)dx=F_{Z_n}\left(\frac{n}{\lambda}\right)=\mathbb{P}\left[X_1+ \ldots +X_n \leq \frac{n}{\lambda}\right]$$
Can you finish?
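A quick way to convince yourself numerically that this probability does not depend on $\lambda$ (which follows from the substitution $u = \lambda x$ in the integral) is a small Monte Carlo sketch; the function name and parameters below are my own, not part of the answer:

```python
import random

def p_sum_below_mean(n, lam, trials=50_000, seed=0):
    """Monte Carlo estimate of P(X_1 + ... + X_n <= n/lam)
    for i.i.d. X_i ~ Exp(lam)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.expovariate(lam) for _ in range(n))
        if s <= n / lam:
            hits += 1
    return hits / trials

# The estimate is the same (up to sampling noise) for every rate lam,
# and it approaches Phi(0) = 1/2 as n grows.
for lam in (0.5, 1.0, 2.0):
    print(lam, p_sum_below_mean(50, lam))
```

All three rates give essentially the same value, a bit above $1/2$ at $n=50$, matching the CLT conclusion $\Phi(0)=1/2$ in the limit.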