Getting rid of unnecessary sets from an uncountable union of subsets


So if $X$ is a set and $\lbrace A_i : i\in I\rbrace$ is an uncountable collection of subsets of $X$, I want to simplify $\cup_{i\in I}A_i$ by getting rid of all sets that do not change the union, i.e., those that are contained in the union of all the other sets.

How can I show that $\cup_{i\in I}A_i = \cup_{i\in J}A_i$, where $J=\lbrace i\in I:A_i\not\subseteq \cup_{j\neq i}A_j\rbrace$ ?

Does it involve something along the lines of Zorn's Lemma/Axiom of Choice? I'm sure there's something really silly that I'm missing. A hint would be very much appreciated! :)

3 Answers

On BEST ANSWER

Counterexample: Let $I=\mathbb R$ and $A_i=(-\infty,i)$.

Then $\bigcup_{i\in I}A_i=\mathbb R$, but since each $A_i\subseteq A_{i+1}$ we get $J=\{i\in I:A_i\not\subseteq\bigcup_{j\ne i}A_j\}=\emptyset$, so $\bigcup_{i\in J}A_i=\emptyset$. In fact, this family $\{A_i:i\in I\}$ has no minimal subfamily which covers $\mathbb R$.

Another example: Consider the family $\{A_i:i\in I\}$ of all $2$-element subsets of $\mathbb R$. Again each $A_i$ is "unnecessary", but in this case there is a minimal subcover, e.g., $\{\{x,x+1\}:\lfloor x\rfloor\text{ is even}\}$.
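A finite analogue of this second example can be checked mechanically. The sketch below uses $X=\{0,1,2,3\}$ and the matching $\{\{0,1\},\{2,3\}\}$ as my own toy stand-ins for $\mathbb R$ and the matching in the answer: every $2$-element set is "unnecessary", yet a minimal subcover exists.

```python
from itertools import combinations

# Finite analogue: the family of all 2-element subsets of X = {0,1,2,3}.
X = {0, 1, 2, 3}
family = [set(p) for p in combinations(X, 2)]

def union(sets):
    """Union of an iterable of sets."""
    out = set()
    for s in sets:
        out |= s
    return out

# Every member is contained in the union of the others, so J would be empty.
unnecessary = all(s <= union(t for t in family if t != s) for s in family)
print(unnecessary)  # True

# Yet a minimal subcover exists, e.g. the matching {{0,1},{2,3}}.
matching = [{0, 1}, {2, 3}]
print(union(matching) == X)  # True: the matching covers X
# Minimality: dropping either pair no longer covers X.
print(all(union(m for m in matching if m != drop) != X for drop in matching))  # True
```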

In fact, a family of sets of bounded finite size always has a minimal subfamily with the same union, though it can't be obtained by simply throwing out all the "unnecessary" sets from the original family; see Taras Banakh's answer to this Math Overflow question. See this other question for some related stuff.

On

The statement

$\cup_{i\in I}A_i = \cup_{i\in J}A_i$

with the given definition of $J$ is not true. Counterexample: $$ I=\{1,2\}, \quad A_1=A_2\neq\emptyset $$ Then $J=\emptyset$ and $\cup_{i\in I}A_i = A_1\neq\emptyset= \cup_{i\in J}A_i$.

If $I$ is infinite, again take $A_i=A\neq\emptyset$ for all $i\in I$. Then $J=\emptyset$ and the statement fails.
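This failure is easy to verify by hand or machine; here is a small Python sketch of the two-index counterexample (the concrete set $\{1,2\}$ is my own choice):

```python
# Counterexample check: two equal sets, A_1 = A_2 = {1, 2}.
# Each A_i is contained in the union of the others, so J comes out empty.
A = {1: {1, 2}, 2: {1, 2}}
I = set(A)

def union(indices):
    """Union of A_i over the given index collection."""
    out = set()
    for i in indices:
        out |= A[i]
    return out

J = {i for i in I if not A[i] <= union(I - {i})}

print(J)         # set(): both indices are "unnecessary"
print(union(I))  # {1, 2}
print(union(J))  # set(), so the two unions differ
```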

It would work if the index set $I$ is well ordered and you instead define $J=\lbrace i\in I:A_i\not\subseteq \cup_{j<i }A_j\rbrace$: for any $x$ in the union, the least index $i$ with $x\in A_i$ belongs to $J$.
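On a finite index set (which is automatically well ordered by its listing order) this repaired definition can be checked directly. A sketch, with the four sets below as my own arbitrary example:

```python
# Well-ordered repair: keep index i only when A_i adds something
# beyond all *earlier* sets, i.e. J = { i : A_i not a subset of U_{j<i} A_j }.
A = {1: {1, 2}, 2: {2, 3}, 3: {1, 3}, 4: {4}}  # arbitrary example sets

def union(indices):
    """Union of A_i over the given index collection."""
    out = set()
    for i in indices:
        out |= A[i]
    return out

order = sorted(A)  # the well-ordering of the index set
J = [i for i in order if not A[i] <= union(j for j in order if j < i)]

print(J)                     # [1, 2, 4]: A_3 = {1,3} is inside A_1 ∪ A_2
print(union(A) == union(J))  # True: the union is unchanged
```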

On

I think that you have a cover $(A_i)_{i\in I}$ and you want to find a minimal subcover $(A_i)_{i\in J}$. As @bof's example shows, that is not always possible: with $A_i=(-\infty,i)$, a subset $J\subseteq\mathbb R$ indexes a subcover if and only if $\sup J= \infty$, and removing any finite subset of such a $J$ still leaves a $J'$ with $\sup J' = \infty$, so no subcover is minimal.

The problem with finding a minimal cover is that the intersection of a decreasing family of covers may not be a cover, so Zorn's lemma does not apply: in the example above, the covers $C_n=\{A_i : i\geq n\}$ form a decreasing chain whose intersection is empty.