The proof of Ascoli's theorem uses the Cantor diagonal process in the following manner: since $(f_n)$ is uniformly bounded, in particular $(f_n(x_1))$ is bounded, and thus the sequence $(f_n(x_1))$ contains a convergent subsequence $(f_{1,n}(x_1))$. Since $(f_{1,n})$ is also bounded, $(f_{1,n})$ contains a subsequence $(f_{2,n})$ such that $(f_{2,n}(x_2))$ is convergent. Iterating, the sequence $(f_{k-1,n})$ contains a subsequence $(f_{k,n})$ such that $(f_{k,n}(x_k))$ is convergent; moreover $(f_{k,n}(x_i))$ is convergent for every $i \leq k$, since $(f_{i,n}(x_i))$ converges for all $i \leq k$ and $(f_{k,n})$ is a subsequence of $(f_{i,n})$. Letting $k \to \infty$, we have $f_{i,n}(x_i) \to f(x_i)$, and we then construct the diagonal sequence $(f_{k,k})$, which converges at each $x_i \in I$. The last step concerning the diagonal sequence $(f_{k,k})$ is unclear to me; what confuses me are the indices $i$, $k$ and $n$. I need a clear explanation that keeps them distinguishable.
Cantor diagonal process in Ascoli's theorem proof
800 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 solutions below.
I see that you understand that we start with a sequence $(f_n)$ and recursively define sequences $(f_{k,n})$ and points $y_k$, $k = 1, 2, \ldots$, such that
(1) $(f_{k,n})$ is a subsequence of $(f_{k-1,n})$ such that $(f_{k,n}(x_k))$ converges to $y_k$
where we set formally $f_{0,n} = f_n$.
Then clearly
(2) $(f_{k,n})$ is a subsequence of $(f_{i,n})$ for $i = 0,...,k$
(3) $\lim_{n \to \infty} f_{i,n}(x_i) = y_i$ for $i = 1,...,k$
$f_{k,n}$ is defined for all $(k,n)$, so we can define
$$f'_n = f_{n,n} .$$
This is a subsequence of $(f_{i,n})$ for each $i = 0, 1, 2, \ldots$ (more precisely, the tail $(f'_n)_{n \ge i}$ is), in particular it is a subsequence of $(f_n)$. But then for all $i$ $$\lim_{n \to \infty} f_{n,n}(x_i) = \lim_{n \to \infty} f_{i,n}(x_i) = y_i .$$
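The bookkeeping can be made concrete with a toy family of "functions" known only at the points $x_1, x_2, \ldots$. The specific choice below ($f_n(x_k)$ = $k$-th binary digit of $n$, and $A_k$ = multiples of $2^k$ as the surviving indices at stage $k$) is an illustrative assumption, not part of the proof:

```python
def f(n, k):
    """Toy model: f_n(x_k) is the k-th binary digit of n (so bounded by 1)."""
    return (n >> (k - 1)) & 1

def A(k, how_many):
    """First `how_many` indices surviving stage k; here A_k is taken to be
    the multiples of 2**k, along which f_n(x_j) = 0 for every j <= k."""
    return [m * 2 ** k for m in range(1, how_many + 1)]

# Nesting: every A_{k+1} picks its indices from A_k.
assert set(A(3, 10)) <= set(A(2, 20))

# The diagonal sequence f_{k,k}: take the n-th element of A_n (= n * 2**n).
diag = [A(n, n)[-1] for n in range(1, 11)]

# The diagonal indices strictly increase, so (f_{d_n}) is a genuine
# subsequence of (f_n).
assert all(a < b for a, b in zip(diag, diag[1:]))

# Pointwise convergence at x_k: from the k-th term on, f_{d_n}(x_k) = 0 = y_k.
for k in range(1, 6):
    assert all(f(d, k) == 0 for d in diag[k - 1:])
```

The last loop is exactly the statement $\lim_{n} f_{n,n}(x_k) = y_k$: only the first $k-1$ diagonal terms may live outside $A_k$.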
A subsequence of a sequence $(x_n)_n$ can be identified with an infinite subset $A \subseteq \mathbb{N}$. If the subset $A$ is written (uniquely) in strictly increasing order as $A = \{n_1, n_2, n_3, \ldots\}$ with $n_i < n_j$ iff $i < j$, then the subsequence is also denoted $(x_{n_k})_k$, but this notation gets confusing when we take sub-subsequences and so on, as we are doing here (introducing more and more indices). So I prefer to just write $(x_i)_{i \in A}$ instead.
I'd write it as: $(f_n(x_1))$ is bounded, so there is an infinite subset $A_1 \subseteq \mathbb{N}$ such that $(f_i(x_1))_{i \in A_1}$ is convergent (to $y_1$, say). Then, having already defined infinite sets $A_1 \supseteq A_2 \supseteq \ldots \supseteq A_n$ such that $(f_i(x_j))_{i \in A_j}$ converges to $y_j$ for $j = 1, \ldots, n$, the sequence $(f_i(x_{n+1}))_{i \in A_n}$ is bounded, so there is some infinite $A_{n+1} \subseteq A_n$ such that $(f_i(x_{n+1}))_{i \in A_{n+1}}$ converges to some $y_{n+1}$ (recursive step). This finishes the recursive definition.
So we get a decreasing sequence of sets $A_n$.
Then let $e_n(A)$ be the $n$-th element of an infinite set $A$ (in its increasing order). The diagonal subsequence is then just $n \mapsto f_{e_n(A_n)}$. Its index set is infinite, the indices being strictly increasing, since $e_n(A_n) \le e_n(A_{n+1}) < e_{n+1}(A_{n+1})$ for all $n$ (the first inequality holds because $A_{n+1} \subseteq A_n$, so removing elements can only push the $n$-th element up). It converges pointwise to $y_n$ at the point $x_n$, for all $n$.
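The map $e_n$ and the diagonal can be sketched in a few lines of Python, representing the infinite sets $A_k$ by generators and taking $A_k$ to be the multiples of $2^k$ for concreteness (an assumption made only for this demo):

```python
from itertools import count, islice

def e(n, A):
    """e_n(A): the n-th element (1-indexed) of A, listed in increasing order."""
    return next(islice(A, n - 1, None))

def A(k):
    """The infinite set A_k; for the demo, the multiples of 2**k."""
    return (m * 2 ** k for m in count(1))

# Diagonal index sequence n -> e_n(A_n).
diag = [e(n, A(n)) for n in range(1, 9)]

# The chain of inequalities from the answer: dropping elements pushes the
# n-th element up, so e_n(A_n) <= e_n(A_{n+1}); and e_n(A_{n+1}) is
# trivially below e_{n+1}(A_{n+1}).  Hence the diagonal strictly increases.
for n in range(1, 8):
    assert e(n, A(n)) <= e(n, A(n + 1)) < e(n + 1, A(n + 1))
assert all(a < b for a, b in zip(diag, diag[1:]))
```

Note that each call `A(k)` builds a fresh generator, since `e` consumes the stream it is given.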
We are decreasing the index set of interest at each step.
Let $a_0:=(1,2,3,\dots) $. Then let $a_1$ be the sequence of indices making $f_{a_1(n)}(x_1)$ convergent.
Then $a_{k+1}$ picks indices only from $a_k$, making the required convergence.
Finally, consider $a := (a_1(1), a_2(2), a_3(3), \dots)$, which is, apart from finitely many initial terms (at most the first $k-1$), a subsequence of each $a_k$.
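In the same spirit, a short sketch shows the diagonal $a$ sitting inside each $a_k$ apart from finitely many initial terms. The concrete rule for how $a_{k+1}$ picks from $a_k$ (keep every other entry) and the truncation length are assumptions for illustration:

```python
# Finite truncations of the index sequences a_0, a_1, a_2, ...
N = 2000
a = {0: list(range(1, N + 1))}           # a_0 = (1, 2, 3, ...)

# a_{k+1} picks indices only from a_k; the concrete rule (keep every
# other entry) is an assumption made for the demo.
for k in range(7):
    a[k + 1] = a[k][1::2]

# a := (a_1(1), a_2(2), a_3(3), ...)
diag = [a[n][n - 1] for n in range(1, 8)]

# For every k, all but the first k-1 terms of a lie in a_k ...
for k in range(1, 8):
    assert all(d in a[k] for d in diag[k - 1:])

# ... and early terms genuinely can fall outside: a_1(1) = 2 is not in a_2.
assert diag[0] not in a[2]

# The diagonal indices strictly increase, so a is itself a subsequence.
assert all(p < q for p, q in zip(diag, diag[1:]))
```

The "finitely many exceptions" are exactly the terms $a_1(1), \ldots, a_{k-1}(k-1)$, which were chosen before stage $k$ thinned the index set.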