Why is $\lim\limits_{n\to\infty } 1=0$ incorrect?


$$ \lim_{n\to\infty } 1 =\lim_{n\to\infty }\frac{n}{n} =\lim_{n\to\infty }\frac{\overbrace{1+1+\ldots+1}^{n \text{ times}}}{n} =\lim_{n\to\infty }\frac{1}{n} + \lim_{n\to\infty }\frac{1}{n} + \ldots =0. $$ Clearly this is incorrect, but why?


There are 2 best solutions below

On BEST ANSWER

Because $\infty\cdot0$ is an indeterminate form. What you wrote amounts to $$1=\lim_{n\to\infty}\frac nn=\lim_{n\to\infty}n\cdot\lim_{n\to\infty}\frac1n=\infty\cdot0=\underbrace{0+0+0+\cdots}_{\text{conveniently 'forgetting' the 'number' of 0's}}\overset{\text{``obviously''}}{=}0.$$ By the same token, one could "show" that $\displaystyle\lim_{n\to\infty}\left(1+\frac1n\right)^n=\left(1+0\right)^\infty=1\cdot1\cdot1\cdots=1\neq e$.
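For contrast, a sketch of the correct evaluation: the numerator contains exactly $n$ ones, so the fraction is identically $1$ before any limit is taken, and no indeterminate form ever arises.

```latex
% The fraction equals 1 for every n, term by term:
\[
  \frac{\overbrace{1+1+\cdots+1}^{n \text{ times}}}{n}
  \;=\; \frac{n}{n}
  \;=\; 1
  \qquad \text{for every } n \ge 1,
\]
% so the limit of this constant sequence is
\[
  \lim_{n\to\infty} 1 \;=\; 1 .
\]
```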


Unlike what Lucian said, it is possible to give a precise meaning (think measure theory) to an infinite sum of limits, and under suitable conventions that sum can indeed be $0$. The real problem, however, is the step that splits the limit. The theorem that lets you split the limit of a sum into the sum of the limits applies only to a fixed, finite number of terms, each of whose limits exists; it can never produce an infinite sum. Here the number of terms is $n$, which itself grows with the limit variable, so the theorem does not apply.
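The sum rule being invoked can be stated precisely; the key point is that the number of summands $k$ is fixed and does not depend on $n$:

```latex
% Sum rule for limits: for a FIXED number k of sequences
% (a_{1,n}), ..., (a_{k,n}), each with an existing limit,
\[
  \lim_{n\to\infty} \sum_{j=1}^{k} a_{j,n}
  \;=\; \sum_{j=1}^{k} \lim_{n\to\infty} a_{j,n}.
\]
% The fallacious argument instead takes k = n, so the number of
% summands changes with n and the rule no longer applies:
\[
  1 \;=\; \lim_{n\to\infty} \sum_{j=1}^{n} \frac{1}{n}
  \;\neq\; \sum_{j=1}^{\infty} \lim_{n\to\infty} \frac{1}{n}
  \;=\; 0 .
\]
```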