Prove that there is no finite subcover

Let $(I_n)$ and $(J_k)$ be sequences of bounded intervals, where the $I_n$ are pairwise disjoint and the $J_k$ are open. Here $\ell(I)$ denotes the length of the bounded interval $I$.

Suppose $\bigcup_{n=1}^NI_n\subset\bigcup_{k=1}^\infty J_k$. Then the $J_k$ form an open cover for the set $\bigcup_{n=1}^N I_n$.

Suppose $\sum_{n=1}^N\ell(I_n)>\sum_{k=1}^M\ell(J_k)$ for every $M$. Prove that the sets $(J_k)$ form an open cover of $\bigcup_{n=1}^N I_n$ that admits no finite subcover.

Two solutions follow.

Solution 1

Suppose $\bigcup_{n=1}^N I_n \subset \bigcup_{k=1}^M J_k$ for some $M \in \mathbb{N}$. Then we have $$\lambda\left(\bigcup_{n=1}^N I_n\right) = \sum_{n=1}^N \lambda(I_n),$$ because the intervals $I_1,\ldots,I_N$ are pairwise disjoint. On the other hand, we have $$\lambda\left(\bigcup_{n=1}^N I_n\right) \le \lambda\left(\bigcup_{k=1}^M J_k\right) \le \sum_{k=1}^M \lambda(J_k)$$ by monotonicity and subadditivity of the measure. This is impossible, because $$\sum_{n=1}^N \lambda(I_n) > \sum_{k=1}^M \lambda(J_k)$$ by assumption.
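
As a quick sanity check of the inequality being used, here is a minimal sketch in Python. The interval data and the helpers `length` and `covers` are invented for illustration; the grid-based cover test is crude, not a proof.

```python
# Sanity check (illustrative data, not from the problem): if finitely many
# open intervals J_1, ..., J_M cover pairwise disjoint intervals
# I_1, ..., I_N, the total length of the J's is at least that of the I's.

def length(iv):
    a, b = iv
    return b - a

def covers(Is, Js, samples=10_000):
    """Crude grid check: every sampled point of each I lies in some open J."""
    for a, b in Is:
        for i in range(samples + 1):
            x = a + (b - a) * i / samples
            if not any(c < x < d for c, d in Js):
                return False
    return True

Is = [(0.0, 1.0), (2.0, 3.0)]               # disjoint intervals I_n
Js = [(-0.1, 1.1), (1.9, 2.6), (2.5, 3.1)]  # a finite open cover J_k

assert covers(Is, Js)
assert sum(map(length, Js)) >= sum(map(length, Is))  # 2.5 >= 2.0
```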

Solution 2

Below is an argument proving the statement under the given assumptions, but perhaps more interesting is the question, raised by @freakish in the comments, of whether the assumptions can ever be satisfied at all. I believe the following argument shows they cannot hold as stated, so the statement is vacuously true.

Suppose there is a disjoint family of intervals $I_1,\ldots,I_N$ and open intervals $J_1,J_2,\ldots$ such that $\bigcup_{n=1}^N I_n \subseteq \bigcup_{m=1}^{\infty} J_m$ and yet $\sum_{n=1}^N \ell(I_n)>\sum_{m=1}^M \ell(J_m)$ for every $M$.

Let me dip into measure theory so that this argument isn't too clunky; I'll also write $\ell$ for Lebesgue measure. It follows that $\ell(\bigcup I_n)=\sum_{n=1}^N \ell(I_n)$ (disjointness), $\sum_{n=1}^N \ell(I_n)\geq \sum_{m=1}^\infty \ell(J_m)$ (taking limits in the assumed inequality; see below), $\sum_{m=1}^\infty \ell(J_m)\geq \ell(\bigcup_{m=1}^{\infty} J_m)$ (countable subadditivity), and $\ell(\bigcup_{m=1}^{\infty} J_m)\geq\ell(\bigcup I_n)$ (monotonicity). Thus $\ell(\bigcup I_n)=\sum_{m=1}^\infty \ell(J_m)=\ell(\bigcup_{m=1}^{\infty} J_m)$. But since the $J_m$ are open intervals, this last equality is only possible if they are pairwise disjoint: otherwise the series would exceed $\ell(\bigcup_m J_m)$ by at least the length of some nonempty intersection $J_m\cap J_{m'}$.
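
To spell out the "taking limits" step: the partial sums $\sum_{m=1}^M \ell(J_m)$ are nondecreasing in $M$ and each is strictly below $\sum_{n=1}^N \ell(I_n)$, so $$\sum_{m=1}^{\infty}\ell(J_m)=\lim_{M\to\infty}\sum_{m=1}^{M}\ell(J_m)\leq\sum_{n=1}^{N}\ell(I_n)$$ (the strict inequalities may become non-strict in the limit).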

But if the $J_m$ are pairwise disjoint and open, they can only cover an interval $I_n$ if $I_n\subseteq J_m$ for a single $m$: otherwise an endpoint of some $J_m$ lying in the interior of $I_n$ would not be covered, since any other open $J_{m'}$ containing that endpoint would have to overlap $J_m$. Together with the equality of measures above, this forces $I_n=J_m$. But this leads to several contradictions: the nonempty $J_m$ are then exactly the intervals $I_1,\ldots,I_N$, so the cover is essentially finite and the strict inequality $\sum_{n=1}^N\ell(I_n)>\sum_{m=1}^M\ell(J_m)$ cannot hold.


As for the original problem: note that it suffices to show that no initial family $J_1,\ldots, J_M$ covers $\bigcup I_n$. Indeed, any finite subcover contains a set $J_M$ of maximal index, and adding the finitely many missing sets $J_i$ with $i<M$ still leaves a finite subcover.

We proceed by contradiction. Suppose there is an $M$ for which $\bigcup_{n=1}^N I_n\subseteq \bigcup_{m=1}^M J_m$. Let $J_{m,n}=J_m\cap I_n$. Since $I_n$ and $J_m$ are intervals, $J_{m,n}$ is an interval, so we can consider $\ell(J_{m,n})$. Since the $I_n$ are disjoint, so are $J_{m,1},\ldots,J_{m,N}$, and since these are disjoint subintervals of $J_m$ we get $\ell(J_m)\geq\sum_{n=1}^N \ell(J_{m,n})$ (equality can fail, since $J_m$ may stick out of $\bigcup I_n$).
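
A toy instance (numbers invented purely for illustration) of why only "$\geq$" holds here: take $J_1=(-1,4)$, $I_1=[0,1]$, $I_2=[2,3]$. Then $J_{1,1}=[0,1]$ and $J_{1,2}=[2,3]$, so $$\ell(J_1)=5>1+1=\sum_{n=1}^2\ell(J_{1,n}).$$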

On the other hand, since $I_n\subseteq \bigcup_{m=1}^M J_m$, in particular $I_n=\bigcup_{m=1}^M J_{m,n}$, so $\ell(I_n)\leq \sum_{m=1}^M \ell(J_{m,n})$ by subadditivity. But then $$\sum_{n=1}^N \ell(I_n)\leq \sum_{n=1}^N\sum_{m=1}^M\ell(J_{m,n})=\sum_{m=1}^M\sum_{n=1}^N\ell(J_{m,n})\leq\sum_{m=1}^M\ell(J_m),$$ which contradicts the assumption that $\sum_{n=1}^N \ell(I_n)>\sum_{m=1}^M \ell(J_m)$.
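
As a numerical sketch of this double counting (same illustrative intervals as in the first snippet; `inter_len` is a made-up helper, not part of the problem):

```python
# Checks the chain: sum_n l(I_n) <= sum_{m,n} l(J_m ∩ I_n) <= sum_m l(J_m)
# on illustrative data.

def inter_len(i, j):
    """Length of the intersection of two intervals given as (left, right)."""
    lo, hi = max(i[0], j[0]), min(i[1], j[1])
    return max(0.0, hi - lo)

Is = [(0.0, 1.0), (2.0, 3.0)]
Js = [(-0.1, 1.1), (1.9, 2.6), (2.5, 3.1)]

total_I = sum(b - a for a, b in Is)                          # 2.0
total_pieces = sum(inter_len(i, j) for j in Js for i in Is)  # 2.1
total_J = sum(b - a for a, b in Js)                          # 2.5

assert total_I <= total_pieces <= total_J
```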