Why can't well ordered sets have infinite decreasing subsequences?


Chapter 2 of Mathematics for Computer Science presents the following:

Define the set $\mathbb F$ of fractions that can be expressed in the form $\frac{n}{n+1}$ for a nonnegative integer $n$. Define $\mathbb N$ as the set of nonnegative integers. $\mathbb N + \mathbb F$ is the set of numbers $n + f$ such that $n \in\mathbb N$ and $f \in\mathbb F$.

The book goes on to prove that $\mathbb N + \mathbb F$ is a well-ordered set. I understand that much.

The book then goes on to say that: In $\mathbb N + \mathbb F$, every element greater than or equal to 1 can be the first element in strictly decreasing sequences of elements of arbitrary finite length. Nevertheless, since $\mathbb N + \mathbb F$ is well-ordered, it is impossible to find an infinite decreasing sequence of elements in $\mathbb N + \mathbb F$, because the set of elements in such a sequence would have no minimum.

  1. Why can't there be such a sequence?

  2. I understand that I cannot just take the subset $\{0, \frac{1}{2}, \frac{2}{3}, \dots, \frac{n}{n+1}, \dots, 1\}$ and simply reverse it to obtain a counterexample. Can someone put into words why this is?

There are 4 best solutions below

---

To answer 1, an ordered set $X$ is well-ordered by definition if every nonempty subset of $X$ has a least element. Thus, if you follow the proof that $\mathbb{N}+\mathbb{F}$ is well-ordered, it cannot be the case that you have a nonempty subset without a least element. In particular, the set of terms of an infinite descending sequence has no least element, so an infinite descending sequence cannot exist in $\mathbb{N}+\mathbb{F}$.

To answer 2, I think the confusion is coming from the fact that a sequence must be indexed by $\mathbb{N}$. This is a fancy way to say that it must have a first element, a second element, a third element, and so on, and that this process describes the entire sequence. You are correct that $\{0,\dotsc,\frac{n}{n+1},\dotsc, 1\}$ describes a subset of $\mathbb{N}+\mathbb{F}$. There are two problems with your suggestion that reversing the order gives you an infinite descending sequence.

First, we cannot change the ordering of $\mathbb{N}+\mathbb{F}$; otherwise we would be answering questions about a different ordered set, which is not relevant to the stated problem. In other words, we need to work with the given rational ordering of $\mathbb{N}+\mathbb{F}$ to define a descending sequence.

Second, look at what happens if we try to sequence points starting with 1 and going downwards: our second point will be some $\frac{n}{n+1}$ for a finite $n$, and there are only finitely many points of the set below this second point with which we can continue the sequence.
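To make the "only finitely many points below" observation concrete, here is a minimal Python sketch using exact rationals (the function name `descending_chain` is my own, not from the text). It builds a strictly decreasing sequence in $\mathbb{N}+\mathbb{F}$ starting at 1: the sequence can be made as long as you like, but choosing the second term $\frac{n}{n+1}$ caps its length at $n+2$.

```python
from fractions import Fraction

def descending_chain(n):
    """Strictly decreasing sequence in N + F starting at 1:
    1 > n/(n+1) > (n-1)/n > ... > 1/2 > 0.  Its length is n + 2."""
    chain = [Fraction(1)]
    for k in range(n, -1, -1):          # n/(n+1), (n-1)/n, ..., 0/1
        chain.append(Fraction(k, k + 1))
    return chain

seq = descending_chain(4)               # 1 > 4/5 > 3/4 > 2/3 > 1/2 > 0
assert all(a > b for a, b in zip(seq, seq[1:]))
```

Once the chain drops below 1, it is trapped among the finitely many elements smaller than its second term, so it must bottom out at 0.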

Hope this helps clear things up!

---

We can directly supply a proof that any weakly decreasing sequence in $\mathbb N + \mathbb F$ must eventually be constant. But first it is helpful to note that the representation is unique:

$$\text{if } n_0 + f_0 = n_1 + f_1, \text{ where } n_0, n_1 \in \mathbb N \text{ and } f_0, f_1 \in \mathbb F, \text{ then } n_0 = n_1 \text{ and } f_0 = f_1.$$

This holds because every element of $\mathbb F$ lies in $[0, 1)$, so $n$ is forced to be the integer part of $n + f$.

Let $(a_k)_{k\ge0}$ be a decreasing sequence with each $a_k \in \mathbb N + \mathbb F$, and write $a_k = n_k + f_k$ using the unique representation above. The decreasing sequence $(n_k)$ of natural numbers is eventually constant, say $n_k = N$ for all $k \ge K$. Subtracting out this integer leaves the sequence $(f_k)_{k \ge K}$, which must also be decreasing. Writing

$$ f_k = \frac{m_k}{m_k+1},$$

each $f_k$ determines $m_k$ uniquely, and since $\frac{m}{m+1}$ increases with $m$, the decreasing $f_k$ yield a decreasing sequence $(m_k)_{k \ge K}$ of natural numbers. That sequence must eventually be constant, so the sequence $(f_k)_{k \ge K}$ must eventually be constant as well.

It follows that the original sequence $(a_k)$ must eventually be constant.
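The unique-representation step can be checked mechanically with exact rational arithmetic. A small Python sketch (the helper `decompose` is my own illustration, not from the book) recovers $n$ and $m$ from an element $n + \frac{m}{m+1}$:

```python
from fractions import Fraction

def decompose(a):
    """Split a = n + m/(m+1) into the pair (n, m).
    Unique because every f = m/(m+1) lies in [0, 1),
    so n is forced to be the integer part of a."""
    n = a.numerator // a.denominator    # integer part
    f = a - n                           # fractional part, in [0, 1)
    m = f / (1 - f)                     # invert f = m/(m+1)
    assert m.denominator == 1, "a is not in N + F"
    return n, int(m)

assert decompose(Fraction(8, 3)) == (2, 2)   # 8/3 = 2 + 2/3
```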

---

To provide a “reason by analogy” answer to (2), I take it you agree that we can’t form an infinite decreasing sequence of natural numbers. Now consider the following argument:

“There is indeed an infinite decreasing sequence of natural numbers. Write out the natural numbers as 0, 1, 2, 3, …, which is an infinite increasing sequence, then reverse it to get an infinite decreasing sequence.”

Why doesn’t this work? Intuitively, the reason is that there’s an asymmetry in the natural numbers. There is a definite “first” natural number (0 or 1, depending on your definition of natural number), but there’s no “last” natural number. We can form an infinite increasing sequence by using the “first” natural number as a starting point, but if we were to reverse it, our sequence wouldn’t have a “first” element anymore, and sequences are supposed to have a clear first, second, third, etc. member.

There’s a similar effect at play for fractions of the form $\frac{n}{n+1}$: there’s a “first” fraction of this sort ($\frac{0}{1} = 0$) but no “last” fraction, so we can’t reverse the sequence $0, \frac{1}{2}, \frac{2}{3}, \ldots$ in the way that’s needed for your attempted counterexample.
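The asymmetry is easy to verify with exact rationals. A quick Python check (an illustration of the point above, not from the answer) confirms that the fractions $\frac{m}{m+1}$ increase with $m$ and stay below 1, so the list has a first term but no last one:

```python
from fractions import Fraction

# The fractions m/(m+1): 0, 1/2, 2/3, 3/4, ...
fracs = [Fraction(m, m + 1) for m in range(100)]

assert fracs[0] == 0                                  # a definite first term
assert all(a < b for a, b in zip(fracs, fracs[1:]))   # strictly increasing
assert all(f < 1 for f in fracs)                      # bounded by 1, never reached
```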

---

Re 1: Well-ordered sets can't have infinite descending sequences because that contradicts well-orderedness.

Suppose there is an infinite descending sequence $a_0 > a_1 > a_2 > \dotsb$

By well-orderedness, the nonempty set $\{a_0, a_1, a_2, \dotsc \}$ has a least element, say $a_n$. Then $a_n \le a_{n+1}$ (since $a_n$ is a least element), but $a_n > a_{n+1}$ (since the sequence is descending). Contradiction.
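The contradiction can be seen concretely: in any finite strictly decreasing list, the minimum is always the last term, so in an infinite descent every candidate minimum $a_n$ is beaten by $a_{n+1}$. A tiny Python illustration (the example values are my own, chosen from $\mathbb N + \mathbb F$):

```python
from fractions import Fraction

# A finite strictly decreasing run in N + F (values chosen for illustration):
# 2 > 1 + 2/3 > 1 + 1/2 > 1 > 2/3 > 1/2 > 0
prefix = [Fraction(2), Fraction(5, 3), Fraction(3, 2),
          Fraction(1), Fraction(2, 3), Fraction(1, 2), Fraction(0)]

assert all(a > b for a, b in zip(prefix, prefix[1:]))
assert min(prefix) == prefix[-1]   # the least element is always the last term
# In an infinite descent a_0 > a_1 > ..., every a_n is beaten by a_{n+1},
# so the set {a_0, a_1, ...} could have no least element.
```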