How do we know that the limit of $(1+1/n)^n$ exists as $n$ increases without bound?


I am starting to learn Calculus now and my learning pathway for finding the derivatives of exponentials and logarithms goes like this:

  1. I assume that $\lim_{n\to+\infty}(1+ \frac{1}{n})^n=e$. I accept this by definition.
  2. Then, I do all the necessary computations to derive other results from it.

Here, I am confused about two things:

  1. Why would such a limit even exist? You can show me the graph of the function, but that only shows *that* the limit seems to exist; it does not properly explain *why* it exists.
  2. Why is $\lim_{n\to-\infty}(1+\frac{1}{n})^n=e$ ?
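As a first sanity check on both questions, here is a quick numerical experiment (a Python sketch, not a proof) evaluating the expression for large positive and large negative $n$:

```python
import math

# Evaluate (1 + 1/n)^n for large positive n, and for large negative n
# (written as n = -m with m positive), and compare with math.e.
for m in [10, 1000, 100000, 10**7]:
    pos = (1 + 1/m) ** m            # n -> +infinity
    neg = (1 + 1/(-m)) ** (-m)      # n -> -infinity
    print(f"|n|={m:>9}: (1+1/n)^n = {pos:.8f}  (n<0): {neg:.8f}")

print(f"math.e       = {math.e:.8f}")
```

Both columns approach the same value, which is what the answers below justify rigorously.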

There are 6 best solutions below

---

You can show that the sequence is monotone and bounded. Since $\mathbb R$ is complete, this gives you the convergence. The second point really is just a matter of definition: you define $e$ to be that limit, and everything else you know about $e$ has to be shown using that definition of $e$.

---

Use AM-GM to prove that $$\left(1+\frac1n\right)^n<\left(1+\frac{1}{n+1}\right)^{n+1}$$

Now establish an upper bound, say $\left(1+\frac1n\right)^n<3$ for all $n$, and conclude by monotone convergence.
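The two facts this argument needs, monotonicity and the bound, can be checked numerically (a Python sketch, not a substitute for the AM-GM proof):

```python
# e_n = (1 + 1/n)^n: check strict increase and the upper bound 3
# for the first 2000 terms.
terms = [(1 + 1/n) ** n for n in range(1, 2001)]

assert all(a < b for a, b in zip(terms, terms[1:]))  # strictly increasing
assert all(t < 3 for t in terms)                     # bounded above by 3

print(terms[0], terms[-1])  # starts at 2.0, creeps up toward e
```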

---

Suppose you have the function $$f(x)=\frac{\sin x}{x}.$$ At $x=0$ we would get $f(0)=\frac{\sin 0}{0}$, which is meaningless: $f$ cannot be defined at $x=0$. But what about $x=10^{-50}$? While $f$ is not defined at $x=0$, it is defined at $x=10^{-50}$. Looking at the graph of $f(x)=\frac{\sin x}{x}$, we see that as $x$ gets close to $0$, the value of $f(x)$ gets close to $1$. That does not mean $f(0)=1$; rather, we say the limit of $f(x)$ exists as $x$ tends to $0$. The limit is an important concept, and it is what makes the definition of the derivative possible.
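You can see this behavior numerically (a small Python sketch): the function is undefined at $0$ itself, but its values settle down to $1$ as $x$ shrinks.

```python
import math

# sin(x)/x is undefined at x = 0, but its values approach 1 as x -> 0.
for x in [0.5, 1e-2, 1e-4, 1e-8]:
    print(f"x = {x:>7.0e}: sin(x)/x = {math.sin(x) / x:.12f}")
```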

As for your second question, $e$ is defined like that. This is the definition of $e$, and all other properties of $e$ follow from this definition.

---

Expanding by the binomial theorem, it is $$1+{n\choose1}\frac1n+{n\choose2}\left(\frac1n\right)^2+\cdots\\ =1+\frac nn+\frac12\cdot\frac{n(n-1)}{n^2}+\cdots$$ Each term is $1/k!$ times a factor that increases to $1$ as $n$ grows.
For the second question, $1-1/n$ is close to the reciprocal of $1+1/n$ when $n$ gets large; for example, $1.01\times0.99=0.9999$ is very close to $1$. So $(1-1/n)^{-n}=1/(1-1/n)^n$ is close to $(1+1/n)^n$, and writing $m=-n$ shows that the limit as $n\to-\infty$ is the same $e$.
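The reciprocal argument can be illustrated numerically (a Python sketch): the gap between $(1-1/n)^{-n}$ and $(1+1/n)^n$ shrinks roughly like $e/n$.

```python
# (1+1/n)(1-1/n) = 1 - 1/n^2, so 1 - 1/n is nearly the reciprocal
# of 1 + 1/n; hence (1 - 1/n)^(-n) is close to (1 + 1/n)^n for large n.
for n in [100, 10000, 10**6]:
    a = (1 + 1/n) ** n
    b = (1 - 1/n) ** (-n)
    print(f"n={n:>7}: (1+1/n)^n = {a:.8f}, (1-1/n)^(-n) = {b:.8f}, diff = {b - a:.2e}")
```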

---

Since you are in calculus, I assume you haven't gone through real analysis yet, but to give you a satisfactory answer I need to introduce you to the definition of the limit of a sequence. What does it mean for the limit of a sequence to exist, or for a sequence to converge to a number (which we call the limit of the sequence)? A real sequence $(x_n)$ converges to a real number $x$ if for every given $\epsilon> 0$ (as small as we please!) there exists a natural number $k$ such that for all $n\geq k$ the terms satisfy $|x_n-x|< \epsilon$.

In other words, if I claim that a sequence is converging to some number or its limit exists, then whatever positive value of $\epsilon$ you put in front of me, I can always locate the term in the sequence, after which the distance between the terms and the proposed limit is always less than your given value of $\epsilon$.

This is the fundamental definition of the limit of a sequence. Related to this there is a theorem by which we can indirectly conclude that a sequence converges, i.e. that its limit exists: if a sequence is monotonically increasing or decreasing and bounded, then it must converge. Note that boundedness alone does not imply convergence; for example, a bounded oscillating sequence such as $((-1)^n)$ does not converge. We will use this theorem here.

Coming back to your problem, we consider the sequence $e_n= \left(1+\dfrac{1}{n}\right)^n$ for $n\in\mathbb{N}$. We will show that the sequence is increasing and bounded, and hence convergent, i.e. its limit exists. Applying the Binomial Theorem, we have

$e_n= \left(1+\dfrac{1}{n}\right)^n= 1+\dfrac{n}{1}\cdot \dfrac{1}{n}+\dfrac{n(n-1)}{2!}\cdot \dfrac{1}{n^2}+\dfrac{n(n-1)(n-2)}{3!}\cdot \dfrac{1}{n^3}+\cdots+\dfrac{n(n-1)\ldots2\cdot 1}{n!}\cdot\dfrac{1}{n^n}$

Distributing each power of $n$ over the factors in the corresponding numerator, we get

$e_n= 1+1+\dfrac{1}{2!}\left(1-\dfrac{1}{n}\right)+\dfrac{1}{3!}\left(1-\dfrac{1}{n}\right)\left(1-\dfrac{2}{n}\right)+\cdots+\dfrac{1}{n!}\left(1-\dfrac{1}{n}\right)\left(1-\dfrac{2}{n}\right)\cdots\left(1-\dfrac{n-1}{n}\right)$

Similarly we have

$e_{n+1}= 1+1+\dfrac{1}{2!}\left(1-\dfrac{1}{n+1}\right)+\dfrac{1}{3!}\left(1-\dfrac{1}{n+1}\right)\left(1-\dfrac{2}{n+1}\right)+\cdots+\dfrac{1}{n!}\left(1-\dfrac{1}{n+1}\right)\left(1-\dfrac{2}{n+1}\right)\cdots\left(1-\dfrac{n-1}{n+1}\right)+\dfrac{1}{(n+1)!}\left(1-\dfrac{1}{n+1}\right)\left(1-\dfrac{2}{n+1}\right)\cdots\left(1-\dfrac{n}{n+1}\right)$

Note that $e_{n}$ contains $n+1$ terms while $e_{n+1}$ contains $n+2$ terms. Moreover each term appearing in $e_n$ is less than or equal to the corresponding term in $e_{n+1}$, and $e_{n+1}$ has one more positive term. Therefore we have $2\leq e_1< e_2< \cdots< e_n< e_{n+1}< \cdots,$ so the terms of $(e_n)$ are increasing.
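The expansion and the term-by-term comparison can be sketched numerically in Python (the helper `expansion_terms` below is just for illustration): summing the terms recovers $(1+1/n)^n$ exactly, and each term of $e_n$ is at most the corresponding term of $e_{n+1}$.

```python
from math import prod, factorial

def expansion_terms(n):
    # k-th term of the expansion of e_n:
    # (1/k!) * (1 - 1/n)(1 - 2/n)...(1 - (k-1)/n), for k = 0, ..., n
    return [prod(1 - j/n for j in range(k)) / factorial(k) for k in range(n + 1)]

n = 20
# the terms really do sum back to (1 + 1/n)^n
assert abs(sum(expansion_terms(n)) - (1 + 1/n) ** n) < 1e-12
# each term of e_n is <= the corresponding term of e_{n+1}
# (and e_{n+1} has one extra positive term at the end)
assert all(a <= b for a, b in zip(expansion_terms(n), expansion_terms(n + 1)))
print("expansion and term-wise comparison verified for n =", n)
```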

To show that the terms of $(e_n)$ are bounded above, we note that if $p= 1, 2, \ldots, n$ then $(1-p/n)< 1$. Moreover $2^{p-1}\leq p!$, so $1/p!\leq 1/2^{p-1}$. Therefore if $n> 1$ then we have

$2< e_n< 1+1+\dfrac{1}{2}+\dfrac{1}{2^2}+\cdots+\dfrac{1}{2^{n-1}}$.

Since it can be verified that $\dfrac{1}{2}+\dfrac{1}{2^2}+\cdots+\dfrac{1}{2^{n-1}}= 1-\dfrac{1}{2^{n-1}}< 1$, the right-hand side is less than $3$, and we deduce that $2< e_n< 3$ for all $n\in\mathbb{N}$. Thus the sequence is bounded and increasing, hence convergent: its limit exists.

---

We have: $$ \left( 1+\frac{1}{n}\right)^n=e^{n\ln\left( 1+\frac{1}{n}\right)} $$ So the limit as $n$ tends to $+\infty$ is: $$\begin{align} \lim_{n\to +\infty} e^{n\ln\left( 1+\frac{1}{n}\right)}&=\lim_{n\to +\infty} e^{n \times \frac{1}{n}}\\ &=e^1\\ &=e \end{align}$$ Let me explain what I've done; you may wonder why I replaced $\ln\left( 1+\frac{1}{n}\right)$ with $\frac{1}{n}$ (I'm assuming you're new to the natural logarithm, sorry if you find my explanation basic): $$\lim_{x\to 1} \frac{\ln(x)}{x-1}=\frac{\mathrm{d} \ln x}{\mathrm{d}x}\Bigg\vert_{x=1}=\frac{1}{1}=1$$ So when $x$ is close to $1$ we have $$\frac{\ln(x)}{x-1}\approx 1\Longleftrightarrow \ln(x)\sim x-1 $$ Substituting $x-1=t$ we obtain $$\ln(1+t)\overset{t\to 0}{\sim} t$$
In this example you have $$\lim_{n\to\infty} \ln\left( 1+\frac{1}{n}\right), $$ and $n \to \infty \Longleftrightarrow \frac{1}{n} \to 0$, so $$\ln\left( 1+\frac{1}{n}\right)\sim \frac{1}{n}$$ I replaced this in the limit, and hence the result. I'm assuming that's the only thing you didn't understand. Generally you may think of this as a technique to evaluate limits: $$\ln\left(1+f(x)\right)\sim f(x)$$ But pay attention: you have to verify that $$\lim_{x\to x_0}1+f(x) =1$$
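The key substitution, $n\ln\left(1+\frac1n\right)\to 1$, is easy to confirm numerically (a Python sketch):

```python
import math

# Check that n * ln(1 + 1/n) -> 1 as n grows.
# math.log1p(x) computes ln(1 + x) accurately for small x.
for n in [10, 1000, 10**6]:
    val = n * math.log1p(1/n)
    print(f"n={n:>7}: n*ln(1+1/n) = {val:.10f}")
```

The discrepancy from $1$ is about $\frac{1}{2n}$, matching the next term of the logarithm's expansion.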