I'm reading the book Foundations of Modern Analysis by Avner Friedman and trying to solve Exercise 1.6.3, which asks to prove that the Lebesgue outer measure of $[a, b]$ is $b-a$.
My solution: for $0 < \epsilon < (b-a)/2$, the inclusions $(a+\epsilon, b-\epsilon) \subset [a, b] \subset (a-\epsilon, b+\epsilon)$ imply, by monotonicity, $\mu^\star((a+\epsilon, b-\epsilon)) \le \mu^\star([a, b]) \le \mu^\star((a-\epsilon, b+\epsilon))$, where $\mu^\star$ denotes the Lebesgue outer measure.
The outer measure of an open interval is its length, hence $\mu^\star((a+\epsilon, b-\epsilon)) = b-a-2\epsilon$ and $\mu^\star((a-\epsilon, b+\epsilon)) = b-a + 2\epsilon$.
Letting $\epsilon \to 0$, we derive that $\mu^\star([a,b]) = b-a$.
My question is why the author gives a hint to apply the Heine–Borel theorem, since outer measure already has the monotonicity property. Is there something wrong in my proof?

You don't know yet that the outer measure of an open interval $(a,b)$ is $b-a$; in fact, that is Exercise 1.6.4. You should first prove Exercise 1.6.3 using the definition of outer measure and the compactness of the closed interval, and then prove Exercise 1.6.4 using Exercises 1.6.3 and 1.6.2.
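To make the role of compactness concrete, here is a sketch (not Friedman's proof, just an outline under the standard definition of $\mu^\star$ via countable covers by open intervals) of how the Heine–Borel theorem enters:

```latex
\begin{proof}[Sketch]
Upper bound: for every $\epsilon > 0$ the single interval $(a-\epsilon, b+\epsilon)$
covers $[a,b]$, so $\mu^\star([a,b]) \le b-a+2\epsilon$, hence
$\mu^\star([a,b]) \le b-a$.

Lower bound: let $[a,b] \subset \bigcup_{i=1}^{\infty} (a_i, b_i)$ be any cover
by open intervals. Since $[a,b]$ is compact (Heine--Borel), finitely many of
the intervals, say $(a_1,b_1), \dots, (a_n,b_n)$, already cover $[a,b]$.
An induction on $n$ shows $\sum_{i=1}^{n} (b_i - a_i) \ge b-a$: pick an
interval containing $a$, say $(a_1, b_1)$; if $b_1 > b$ we are done, and
otherwise $b_1 \in [a,b]$ lies in another interval of the subcover, so we
chain forward, gaining length at each step. Taking the infimum over all
countable covers gives $\mu^\star([a,b]) \ge b-a$.
\end{proof}
```

Note that the compactness step is exactly what your monotonicity argument cannot replace: it reduces an arbitrary countable cover to a finite one, for which the length comparison can be done by induction.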