If subsequent terms keep getting larger, does that mean no limit exists?


Take the following Taylor expansion:

$$ \dfrac{1}{1-x} = 1 + x + x^2 + x^3 + \cdots $$

This only holds for $|x| < 1$. Let's say you want to prove it doesn't hold for $x > 1$.

You can say that $\dfrac{1}{1-x}$ becomes negative for $x > 1$, while the right-hand side is a sum of positive numbers and so can never be negative.

Intuitively, I was also thinking that for $x>1$ each subsequent term in the Taylor expansion becomes larger, so the series has no limit. But are there any counterexamples, where subsequent terms get larger but there is still a finite limit? And if so, how could I restate my conjecture in a way that IS correct?
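As a quick numeric sketch of the situation in the question (not a proof), one can compare partial sums of the geometric series for an $x$ inside the interval of convergence and one outside it; `partial_sum` is a hypothetical helper, not a standard function:

```python
# Partial sums of 1 + x + x^2 + ... + x^(k-1) for x inside vs. outside |x| < 1.
def partial_sum(x, k):
    """Return the k-term partial sum of the geometric series at x."""
    return sum(x**n for n in range(k))

for k in (10, 20, 40):
    print(k, partial_sum(0.5, k), partial_sum(2.0, k))
# For x = 0.5 the sums settle near 1/(1 - 0.5) = 2;
# for x = 2 they grow without bound.
```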


There are 3 best solutions below

On BEST ANSWER

If for the $n$-th term $a_n$ of a series it is true that $\lim_{n \to \infty} a_n \ne 0$, then the series does not converge. This is exactly what you intuitively suspected; the result is a standard theorem known as the $n$-th term test for divergence.
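A hedged numeric check of the test for the terms $a_n = x^n$ from the question (`term` is an illustrative helper, not part of any library):

```python
# The n-th term test: if the terms a_n = x^n do not tend to 0,
# the series sum of x^n cannot converge.
def term(x, n):
    return x ** n

# For x = 2 the terms grow, so lim a_n != 0 and the series diverges.
print([term(2.0, n) for n in range(6)])  # [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]

# For x = 0.5 the terms shrink toward 0, so this test alone is inconclusive.
print([term(0.5, n) for n in range(6)])
```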

On

If $\displaystyle\sum_{n=1}^\infty a_n$ converges, then $\lim\limits_{n\to\infty} a_n = 0$.

That is a standard theorem found in first-year calculus texts and in every analysis text that deals with this topic.

Of course, you find things on the internet that say $1+2+3+4+\cdots=\dfrac{-1}{12}$ and the like, but those deal with different kinds of convergence from the one usually intended in the result you state.

On

I could have used the necessary condition above, but here's a more direct proof, just for the sake of looking at things from a different perspective.
Given the series $$ \sum_{i = 1}^{k} a_i\ \ , \ \ \mbox{ subject to:} \ \ \ \ a_n=f(n) \land \begin{cases} f: \mathbb{N} \to \mathbb{R}\\ f(n+1)>f(n)>0 \end{cases} $$ the monotonicity implies: $$ \sum_{i = 1}^{k} a_i=\sum_{i = 1}^{k} f(i) \ge \sum_{i = 1}^{k} f(1) = k\cdot f(1) >0 $$ Now suppose the series converges: $$\lim_{k\to\infty}\sum_{i = 1}^{k} a_i = \lim_{k\to\infty}\sum_{i = 1}^{k} f(i) = \hat a$$ Since every partial sum is at least $k\cdot f(1)$, this would imply: $$ \lim_{k\to\infty}(k\cdot f(1)) = f(1)\cdot\lim_{k\to\infty}k = \hat b \le \hat a $$

which is impossible, because: $$ f(1)\cdot\lim_{k\to\infty}k = +\infty $$ so we conclude that: $$ \sum_{i = 1}^{+\infty} a_i \ \ \ \mbox{ diverges to } +\infty. $$
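The bound in this proof can be sketched numerically; the choice $f(n) = n$ below is an arbitrary assumed example of an increasing positive function:

```python
# Numeric sketch of the lower bound in the proof: for increasing positive
# terms a_n = f(n), the k-th partial sum is at least k * f(1), which
# grows without bound as k -> infinity.
def f(n):
    # Assumed increasing positive function for illustration only.
    return float(n)

for k in (10, 100, 1000):
    s = sum(f(i) for i in range(1, k + 1))
    print(k, s, k * f(1))  # partial sum vs. its lower bound k * f(1)
```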