Rudin's Principles of Mathematical Analysis, Theorem 2.41.


Let $E$ be a subset of $\mathbb{R}^{k}$, with the Euclidean metric on it. Then $1.$ implies $2.$, where

$1.$ Every infinite subset of $E$ has a limit point in $E$

$2.$ $E$ is closed.

Proof. Suppose by contradiction that $E$ is not closed. Since by definition a subset of a metric space is said to be closed if it contains all its limit points, we are assuming there exists a point $x_0$ of $\mathbb{R}^k$ which is a limit point of $E$ and which lies in the complement of $E$ with respect to $\mathbb{R}^k$. For every $n\in\mathbb{N}$, there are points $x_n\in E$ such that $|x_n-x_0|<\frac{1}{n}$. Let $S$ be the set of these points. Then $S$ is infinite (otherwise $|x_n-x_0|$ would have a constant positive value, for infinitely many $n$)....

I can't understand the "otherwise" argument. Since he said $S$ is infinite, I suppose that "otherwise" means that $S$ is assumed, for contradiction, to be finite. But then how can we have infinitely many indices $n$, with $x_n$ a point of $S$ for each $n\in\mathbb{N}$? I think the only way is to "count" some of the points infinitely many times. But then how can the condition $|x_n-x_0|<\frac{1}{n}$ hold for all of them, given that $\frac{1}{n}$ converges to $0$, the only point $x$ with $|x-x_0|=0$ is $x_0$ itself, and $x_0\notin S$?

Could you please expand Rudin's argument so as to make it more understandable to me? Thank you.


Best answer:

To elaborate on the "otherwise", let us look at the negation.
That is, let us ask ourselves what would happen if that were not true.


Suppose that $S$ were finite.
Note that $S$ was constructed in a way such that we have the following property:

For every $n \in \Bbb N$, there exists $x_n \in S$ such that $|x_n - x_0| < 1/n$.

Now, since $S$ is finite, this means that not all $x_n$ can be distinct. In fact, it would mean that there's some $x' \in S$ such that $$x' = x_n\quad\text{for infinitely many } n \in \Bbb N.$$

(Why? If that didn't happen, then $\Bbb N$ could be written as a finite union of finite sets.)
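To make the pigeonhole step explicit (a sketch; the index sets $A_{x'}$ are my notation, not Rudin's or the answerer's): partition the indices according to which point of $S$ they pick out,

$$\Bbb N \;=\; \bigcup_{x' \in S} A_{x'}, \qquad\text{where } A_{x'} := \{\, n \in \Bbb N : x_n = x' \,\}.$$

If every $A_{x'}$ were finite, then $\Bbb N$ would be a union of finitely many finite sets (one per element of the finite set $S$), hence finite, which is absurd. So $A_{x'}$ must be infinite for at least one $x' \in S$.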

Let $A$ be the collection of all such $n$ that satisfy the above condition. This is an infinite set. What Rudin is saying is that

$$|x_n - x_0| \text{ is constant for all } n \in A.$$ And that is simply because $|x_n - x_0| = |x' - x_0|$ for all $n \in A$.

This is why $|x_n - x_0|$ is constant for infinitely many $n$.
Moreover, this constant is positive because $x_0 \notin S \ni x'$ and thus, $x' \neq x_0$.


The contradiction that he would derive from this is that a fixed positive number $\delta = |x' - x_0|$ is less than $1/n$ for infinitely many $n$. (Since it's infinitely many, you choose an $n > 1/\delta$ and arrive at the contradiction.)
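Spelled out, with $\delta := |x' - x_0| > 0$ as above: since $A$ is infinite, it contains some index $n > 1/\delta$, and for that $n$ the construction of $S$ gives

$$\delta \;=\; |x_n - x_0| \;<\; \frac{1}{n} \;<\; \delta,$$

a contradiction. Hence $S$ is infinite after all, and Rudin's proof can proceed.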