I am trying to show countable subadditivity of outer measure:
$m^*(\cup_n A_n) \leq \sum_n m^*(A_n),$
where $n$ ranges over some countable index set. The standard proof goes like this:
1.) If the RHS is infinite, then the inequality holds trivially.
2.) Otherwise, prove that the LHS is $\leq$ RHS $+\ \epsilon$ for arbitrary $\epsilon > 0$. To do this, for each set $A_n$ we choose a covering by a sequence of intervals $(I_m^n)_m$ whose total measure is $\leq m^*(A_n) + \frac{\epsilon}{2^n}$ (possible by the definition of $m^*(A_n)$ as an infimum). We then line up all the intervals into a single countable family, and since $\cup_n A_n \subseteq \cup_{n,m} I_m^n$ we conclude that $m^*(\cup_n A_n) \leq \sum_{n,m} m^*(I_m^n) \leq \sum_n m^*(A_n) + \epsilon$. Letting $\epsilon \to 0$, we're done.
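Written out, the estimate in step 2 is the following chain (assuming the index $n$ runs over $\mathbb{N}$, so that $\sum_n \epsilon/2^n = \epsilon$):

```latex
\begin{align*}
m^*\Big(\bigcup_n A_n\Big)
  &\le \sum_{n}\sum_{m} m^*(I_m^n)
    && \text{since } \textstyle\bigcup_n A_n \subseteq \bigcup_{n,m} I_m^n \\
  &\le \sum_{n}\Big(m^*(A_n) + \frac{\epsilon}{2^n}\Big)
    && \text{by the choice of the coverings} \\
  &= \sum_{n} m^*(A_n) + \epsilon
    && \text{since } \textstyle\sum_{n=1}^\infty 2^{-n} = 1.
\end{align*}
```

All terms are nonnegative, so the double sum can be rearranged freely and the first inequality is legitimate regardless of the order in which the intervals are lined up.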
My question is: why do we need to go through intervals at all? It feels like there's some subtlety that I'm missing. If we can just assert that
$\cup_n A_n \subseteq \cup_{n,m} I_m^n \implies m^*(\cup_n A_n) \leq \sum_{n,m} m^*(I_m^n),$
then why can't we skip the whole interval kerfuffle and just write:
$\cup_n A_n \subseteq \cup_n A_n \implies m^*(\cup_n A_n) \leq \sum_n m^*(A_n)?$
What is the proof of the first assertion? Why can't we just use the same logic for the second?
The reason you can do it with intervals is that this is precisely how the outer measure is defined (assuming you already know how it is defined for intervals):
$m^*(A)=\inf\{\sum_{n=1}^\infty m^*(I_n): A\subseteq \cup_{n=1}^\infty I_n,\ I_n \text{ are intervals}\}$
So if $\cup_n A_n\subseteq \cup_{n,m} I_m^n$, where $(I_m^n)_{n,m=1}^\infty$ is a countable family of intervals, then $m^*(\cup_n A_n)\leq \sum_{n,m} m^*(I_m^n)$ holds immediately, simply because the infimum of a set of numbers can't be greater than any element of that set: the sum on the right is one of the numbers over which the infimum defining $m^*(\cup_n A_n)$ is taken. But if the covering sets are not intervals (for instance, the sets $A_n$ themselves), then the inequality is no longer built into the definition. That it still holds for arbitrary covering sets is exactly countable subadditivity, which is what you have to prove.
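Concretely, after re-indexing the doubly indexed family $(I_m^n)$ into a single sequence of intervals $(J_k)_{k=1}^\infty$ (any bijection $\mathbb{N}\times\mathbb{N}\to\mathbb{N}$ works), the containment hypothesis exhibits $\sum_k m^*(J_k)$ as a member of the set whose infimum defines $m^*(\cup_n A_n)$:

```latex
\[
\bigcup_n A_n \subseteq \bigcup_{k=1}^\infty J_k
\quad\Longrightarrow\quad
m^*\Big(\bigcup_n A_n\Big)
 = \inf\Big\{\sum_{k=1}^\infty m^*(J'_k) :
     \bigcup_n A_n \subseteq \bigcup_{k=1}^\infty J'_k,\
     J'_k \text{ intervals}\Big\}
 \le \sum_{k=1}^\infty m^*(J_k)
 = \sum_{n,m} m^*(I_m^n).
\]
```

No such shortcut exists when the $J_k$ are replaced by arbitrary sets, because arbitrary sets are not admissible coverings in the definition of $m^*$.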