Let $(I_n)$ and $(J_k)$ be sequences of bounded intervals, where the $I_n$ are pairwise disjoint and the $J_k$ are open. $\ell(I)$ denotes the length of a bounded interval $I$.
Suppose $\bigcup_{n=1}^NI_n\subset\bigcup_{k=1}^\infty J_k$. Then the $J_k$ form an open cover for the set $\bigcup_{n=1}^N I_n$.
Suppose $\sum_{n=1}^N\ell(I_n)>\sum_{k=1}^M\ell(J_k)$ for every $M$. Prove that the intervals $(J_k)$ form an open cover of $\bigcup_{n=1}^N I_n$ that admits no finite subcover.
Suppose, seeking a contradiction, that a finite subcover exists. Letting $M$ be the largest index appearing in it, we have $\bigcup_{n=1}^N I_n \subset \bigcup_{k=1}^M J_k$ for some $M \in \mathbb{N}$. Since the intervals $I_1,\ldots,I_N$ are pairwise disjoint, $$\lambda\Big(\bigcup_{n=1}^N I_n\Big) = \sum_{n=1}^N \lambda(I_n).$$ On the other hand, by monotonicity and subadditivity of the measure $\lambda$, $$\lambda\Big(\bigcup_{n=1}^N I_n\Big) \le \lambda\Big(\bigcup_{k=1}^M J_k\Big) \le \sum_{k=1}^M \lambda(J_k).$$ Combining the two displays gives $\sum_{n=1}^N \lambda(I_n) \le \sum_{k=1}^M \lambda(J_k)$, which is impossible because $$\sum_{n=1}^N \lambda(I_n) > \sum_{k=1}^M \lambda(J_k)$$ by assumption (note $\lambda(I) = \ell(I)$ for every bounded interval $I$). Hence no finite subcover exists.
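The contradiction rests on the contrapositive fact that whenever finitely many open intervals $J_1,\ldots,J_M$ do cover disjoint intervals $I_1,\ldots,I_N$, the total $J$-length must be at least the total $I$-length. A quick numerical sanity check of this, with made-up intervals (not taken from the problem):

```python
# Sanity check: if finitely many open intervals J_k cover pairwise disjoint
# intervals I_n, then sum of lengths of J_k >= sum of lengths of I_n.
# The concrete intervals below are hypothetical examples.

def length(iv):
    a, b = iv
    return b - a

def covered(point, open_ivs):
    # Membership in an open interval is a strict inequality on both sides.
    return any(a < point < b for a, b in open_ivs)

# Pairwise disjoint intervals I_n and a finite open cover J_k of their union.
I = [(0.0, 0.5), (0.75, 1.0)]
J = [(-0.1, 0.6), (0.7, 1.05)]

# Sample each I_n densely and confirm every sample lies in some J_k.
samples = [a + (b - a) * t / 1000 for a, b in I for t in range(1001)]
assert all(covered(x, J) for x in samples)

# The length comparison that drives the contradiction in the proof:
total_I = sum(length(iv) for iv in I)   # 0.5 + 0.25 = 0.75
total_J = sum(length(iv) for iv in J)   # 0.7 + 0.35 = 1.05
assert total_I <= total_J
```

If the problem's hypothesis $\sum_{n=1}^N\ell(I_n) > \sum_{k=1}^M\ell(J_k)$ held for every $M$, the final inequality above could never be satisfied by any finite subfamily, which is exactly why no finite subcover can exist.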