I'm working on measures and I need to prove the following:
There exists no measure $\mu$ on $([0,1],\mathcal{P}([0,1]))$ such that
- $\mu([0,1]) = 1$
- $\mu(A) \in \{0,1\}$ for any $A \subseteq[0,1]$
- $\mu(F) = 0$ for any $F \subseteq[0,1]$ finite
I understand why this is so and to some extent how to prove it:
1) We suppose there is such a measure.
2) We split the interval into two disjoint subintervals and use the additivity of measures on disjoint sets to show that one subinterval has measure 1 and the other measure 0.
3) We repeat this recursively and obtain a nested sequence of intervals of measure 1 whose intersection is a singleton of measure 1.
4) This is a contradiction since the measure of finite sets is 0.
I'm not sure how to write step 3) in a mathematically rigorous way. Could anyone please help me? I'm pretty sure this is a standard exercise for an undergraduate course on Lebesgue theory, but I didn't manage to find a proof online. If there has already been such a post, a redirection would be appreciated. Thanks in advance.
We define the sets $A_n$ recursively as follows:
Let $A_0=[0,1]$. Given an interval $I=[a,b]$, let $f(I):=[a,\frac{a+b}{2}]$ and $g(I):=[\frac{a+b}{2},b]$. Clearly $I=f(I)\cup g(I)$, and the two halves overlap only in the midpoint, a single point of measure zero by hypothesis.
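For concreteness, at the first level this gives
$$f([0,1])=[0,\tfrac{1}{2}],\qquad g([0,1])=[\tfrac{1}{2},1],\qquad f([0,1])\cap g([0,1])=\{\tfrac{1}{2}\},$$
and the overlap $\{\tfrac{1}{2}\}$ is a finite set, hence of measure $0$ by the third hypothesis.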
Assuming we constructed $A_n$ such that $\mu(A_n)=1$, we construct $A_{n+1}$ as follows:
Consider $f(A_n)$ and $g(A_n)$: their union $A_n=f(A_n)\cup g(A_n)$ has measure $1$, and since they overlap only in the midpoint (a finite set of measure $0$), we get $\mu(f(A_n))+\mu(g(A_n))=1$. As $\mu$ takes only the values $0$ and $1$, exactly one of $f(A_n)$ and $g(A_n)$ has measure $1$; let $A_{n+1}$ be that interval.
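Spelling out the additivity step: writing $m$ for the midpoint shared by the two halves, finite additivity yields the inclusion-exclusion identity
$$\mu(f(A_n))+\mu(g(A_n))=\mu\big(f(A_n)\cup g(A_n)\big)+\mu\big(f(A_n)\cap g(A_n)\big)=\mu(A_n)+\mu(\{m\})=1+0=1.$$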
We obtain a nested sequence of closed intervals $A_n$ such that $\mu(A_n)=1$. Moreover, $A_{n+1}$ is either $f(A_n)$ or $g(A_n)$, so its length is half the length of $A_n$; by induction, $A_n$ has length $2^{-n}$.
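To finish along the lines of steps 3) and 4) from the question: the $A_n$ are nested closed intervals whose lengths $2^{-n}$ shrink to $0$, so by the nested interval theorem $\bigcap_{n\ge 0}A_n=\{x\}$ for some $x\in[0,1]$. Since $\mu([0,1])=1<\infty$, continuity of the measure from above gives
$$\mu(\{x\})=\mu\Big(\bigcap_{n\ge 0}A_n\Big)=\lim_{n\to\infty}\mu(A_n)=1,$$
which contradicts the hypothesis that every finite set has measure $0$.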