Why are all convergent sequences necessarily Cauchy?


I can understand the proof, which I could do myself:

$|s_n - s_m| = |s_n - s + s - s_m|$

$\Rightarrow |s_n - s_m| \leq |s_n - s| + |s_m - s| $

For every $\epsilon > 0, \exists\ N(\epsilon) \in \mathbb{N}$ s.t. $|s_n - s| < \epsilon$ and $|s_m - s| < \epsilon$ for all $m, n > N(\epsilon)$. Hence

$|s_n - s_m| < \epsilon + \epsilon \ \ \forall \ \ m,n > N(\epsilon)$

$\therefore |s_n - s_m| < 2\epsilon$, and since $\epsilon > 0$ was arbitrary, so is $2\epsilon > 0$.

The fact that the Cauchy sequence gradually closes up, i.e. that its elements get closer together, is obvious from the last line. $N(\epsilon)$ is necessarily a non-increasing function of $\epsilon$.

But I can't see this intuitively. I can think of instances in which the sequence may close up, then diverge, then close up again at infinity. What stops the sequence from coming as close as $\epsilon_1$ to the limit, then moving away, then coming back after some more terms?

Of course, a Cauchy sequence does not allow this, but I don't see how this behaviour violates the basic definition of the existence of a limit. The basic definition simply requires that $\forall \epsilon >0\ \exists N \in \mathbb{N}$ s.t. $|s_n -s| < \epsilon$ for all $n \geq N$. There is no rule relating how $\epsilon$ must shrink as $N$ grows.

There are 5 answers below.

BEST ANSWER

Remember that the definition of a sequence $s_n$ converging to $s$ means for every $\epsilon > 0$ there's some $N(\epsilon)$ so that $$ \lvert s_n - s \rvert < \epsilon $$ for every $n \ge N$.

The important part is that after some fixed point in the sequence, the terms only get closer and closer to $s$. Intuitively, I hope you can then see that since the terms beyond $s_N$ all get closer to $s$, they must also be getting closer to each other (i.e. it's a Cauchy sequence).

To address your specific counterexample, a sequence that gets close to $s$, then gets really far from $s$, then converges back to $s$ again: note again that $N$ must be chosen so that all terms after $s_N$ stay close to $s$. So the $N$ that gets chosen would have to lie at the point in the sequence where it comes back to converge the second time (since otherwise it wouldn't be true that $s_n$ is close to $s$ for every $n \ge N$).
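The point about where $N$ must sit can be checked numerically. Here is a small sketch with a toy sequence of my own (not from the question): it tends to $0$, hugs the limit, takes an excursion away between indices 21 and 40, then settles down for good. The smallest valid $N$ for $\epsilon = 0.1$ lands just after the excursion.

```python
# Hypothetical sequence (my own example): it gets close to its limit 0,
# wanders away for a while, then converges for good.
def s(n):
    if 21 <= n <= 40:      # temporary excursion far from the limit
        return 1.0
    return 1.0 / n         # otherwise 1/n, which tends to 0

eps = 0.1
# Smallest N such that |s(n) - 0| < eps for EVERY n >= N (checked up to 10**4).
N = next(N for N in range(1, 10**4)
         if all(abs(s(n)) < eps for n in range(N, 10**4)))
print(N)  # 41: N must land after the excursion ends at n = 40
```

Even though $s_{11}, \dots, s_{20}$ are already within $0.1$ of the limit, no $N \le 40$ works, because the excursion terms would then violate "$|s_n| < \epsilon$ for every $n \ge N$".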

I hope this is clear and helps. Let me know if this is confusing!

6
On

Suppose $x_n\to x$. Then, fix $\varepsilon>0$. There exists an $N\in\mathbb N$ such that for all $n\in\mathbb N$ $$n\geq N\implies |x_n-x|<\frac{\varepsilon}{2}.$$

Then, if $n,m\geq N$, $$|x_n-x_m|\leq |x_n-x|+|x_m-x|<\frac{\varepsilon}{2}+\frac{\varepsilon}{2}=\varepsilon.$$

Q.E.D.
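The $\varepsilon/2$ argument above can be sanity-checked numerically. This is my own illustrative sketch, using $x_n = 1/n \to 0$:

```python
# Numeric check of the eps/2 argument for x_n = 1/n -> 0 (illustrative choice).
eps = 0.01
x = lambda n: 1.0 / n
# Convergence definition applied with eps/2: |x_n - 0| < eps/2 needs n > 2/eps.
N = int(2 / eps) + 1                       # N = 201
tail = [x(n) for n in range(N, N + 1000)]  # a long stretch of the tail
worst = max(tail) - min(tail)              # largest pairwise gap in the stretch
print(worst < eps)  # True
```

Every pair of tail terms differs by at most `worst`, comfortably below $\varepsilon$, exactly as the triangle-inequality bound predicts.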

4
On

You seem to have a misconception about the Cauchy condition. Rather than address it directly, I offer you a different way of looking at things which I hope will clarify everything. For a sequence $x_n$, consider the condition $$\exists L\ \forall \varepsilon >0\ \exists N \text{ s.t. } |x_n-L|<\varepsilon\ \forall n>N,$$ which is nothing but the familiar notion that $L$ is the limit of the sequence. Now change the order of the quantifiers to get $$\forall \varepsilon >0\ \exists L\ \exists N \text{ s.t. } |x_n-L|<\varepsilon\ \forall n>N,$$ which expresses something quite different. In the first condition you fix the limit once and for all, and then you must produce an $N$ for every $\varepsilon$. In the second you are first given $\varepsilon$, and only then may you choose $L$ and produce $N$. So if the first expression is the definition of the limit, then the second condition is the definition of a sort of varying limit. Now, it's an easy exercise that the second condition is equivalent to the sequence being Cauchy.

Thus the difference between a sequence being Cauchy and converging is in the order of quantification. Notice that it is now a triviality that if a sequence converges then it is Cauchy.
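The swapped-quantifier condition can be probed numerically. In this sketch (my own illustration, not from the answer), the sequence is the partial sums of the alternating harmonic series, which converge to $\ln 2$; for each $\varepsilon$ a *different* candidate $L$ is chosen, namely a far-out term of the sequence itself rather than the true limit.

```python
# Swapped-quantifier condition: for each eps we may pick a DIFFERENT
# candidate L. Here L is a far-out term of the sequence itself, never
# the true limit ln 2. Sequence choice is my own illustration.
import math

def x(n):
    # n-th partial sum of the alternating harmonic series, -> ln 2
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

for eps in (0.1, 0.01):
    N = int(1 / eps) + 1   # alternating series: |x_n - ln 2| < 1/(n+1)
    L = x(2 * N)           # eps-dependent choice of L, not the true limit
    assert all(abs(x(n) - L) < eps for n in range(N, N + 200))
print("swapped-quantifier condition holds")
```

Note that the chosen $L$ changes with $\varepsilon$, which is exactly what the reordered quantifiers permit; no single fixed limit is ever named.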

0
On

Suppose a sequence converges to a limit. This means that for any $\epsilon$, we can find a segment of length $2\epsilon$ in which the sequence eventually lies (with the limit at the middle of the segment). Once the sequence is in there, its elements can't be further than $2\epsilon$ from one another (because the furthest any two elements could get would be one on each end of the segment).
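This segment picture can be sketched numerically; the sequence $s_n = 2 + (-1)^n/n$ with limit $s = 2$ is my own example:

```python
# Segment picture: once the tail lies in [s - eps, s + eps], no two tail
# elements can be more than 2*eps apart. Example sequence is my own choice.
eps = 0.05
s = 2.0
seq = [s + (-1) ** n / n for n in range(1, 5000)]         # converges to s
tail = [t for n, t in enumerate(seq, start=1) if n > 20]  # 1/n <= eps for n > 20
assert all(abs(t - s) <= eps for t in tail)               # tail inside the segment
print(max(tail) - min(tail) <= 2 * eps)  # True
```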

0
On

"I can think of instances in which the sequence may close up, then diverge, then close up again at infinity."

This can't happen! The definition says there is some index beyond which all elements stay within a fixed distance of the limit, right? That condition imposes regularity on the separation. In any case, once one draws a picture it's really easy to see why this theorem is true in a typical metric space:

[figure: the sequence drawn as blue ticks on a white canvas, converging into a ball of radius $\frac{\epsilon}{2}$ around the limit $L$]

I have drawn the sequence as blue ticks on a white canvas; the red guide lines show how the sequence moves along the canvas as the index increases. I consider a ball of radius $\frac{\epsilon}{2}$ around the limit $L$ and zoom into it with a super strong magnifying glass.

By the triangle inequality, $|a_N-a_M| \leq |a_N - L| + |a_M - L|$. Since both points lie in the ball of radius $\frac{\epsilon}{2}$, each of their distances to the centre $L$ is less than $\frac{\epsilon}{2}$:

$$ |a_N - a_M| \leq |a_N - L | + |a_M - L| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$$

But, by the definition of convergence, there is an index beyond which all terms (such as $a_N$ and $a_M$ above) are contained in the half-epsilon ball. Since we can take $\epsilon$ as small as we want, this is exactly the definition of Cauchy. QED.