Consider a strictly increasing, absolutely continuous function $f: [0,1] \to \mathbb{R}$.
Prove that for any $G_{\delta}$ set $G \subseteq [0,1]$, the Lebesgue measure $m$ of $f(G)$ is given by
$$ m(f(G))=\int_{G}f'\, dm $$
MY THOUGHTS:
Knowing $f$ is absolutely continuous means we can use a version of the fundamental theorem of Lebesgue integration. If we pretend $G$ is an interval, letting $c$ be the "left endpoint" of $G$ and $d$ be the "right endpoint" of $G$, then $f(d)=f(c)+\int_{[c,d]}f'\,dm$. Since $f$ is strictly increasing, $f(G)$ is the interval from $f(c)$ to $f(d)$, which implies $$ m(f(G)) = f(d)-f(c) = \int_{[c,d]}f'\,dm $$ But the issue is that $G$ might not be an interval: an open set is a countable union of disjoint open intervals, while a $G_{\delta}$ set is only a countable intersection of open sets, so the proof doesn't exactly follow through.
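As a numerical sanity check (not a proof) of this FTC step, here is a small script using the hypothetical choice $f(x) = x^2 + x$, which is strictly increasing and absolutely continuous on $[0,1]$, and a single interval $[c,d]$; the interval and the function are my own illustrative choices, not part of the problem:

```python
# Check f(d) = f(c) + ∫_{[c,d]} f' dm for f(x) = x^2 + x, f'(x) = 2x + 1.
# The interval [0.3, 0.8] is an arbitrary illustrative choice.

def f(x):
    return x * x + x

def fprime(x):
    return 2 * x + 1

c, d = 0.3, 0.8
N = 100_000
h = (d - c) / N
# Midpoint Riemann sum approximating the integral of f' over [c, d].
integral = sum(fprime(c + (k + 0.5) * h) for k in range(N)) * h

# f(d) - f(c) should agree with the integral up to discretization error.
assert abs(f(d) - (f(c) + integral)) < 1e-8
print(f(d) - f(c), integral)
```

The agreement here is of course just the classical FTC for one interval; the whole difficulty of the problem is extending it past intervals.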
UPDATE (is this correct?):
Say WLOG our $G_{\delta}$ set $G$ ends up being a countable union of disjoint open intervals $G = \bigcup_{n=1}^{\infty}G_{n}$ (as happens when $G$ is open). Then on each $G_n = (a_n, b_n)$ we have $m(f(G_{n}))=\int_{G_{n}}f'\,dm$ by my argument above, since the endpoints have measure zero. Then
$$ \int_{G}f'\,dm = \int_{\bigcup_{n=1}^{\infty}G_{n}}f'\,dm = \sum_{n=1}^{\infty}\int_{G_{n}}f'\,dm = \sum_{n=1}^{\infty} m(f(G_n)) = \sum_{n=1}^{\infty} \bigl(f(b_n)-f(a_n)\bigr) = m(f(G)),$$ where the last equality holds because $f$ is strictly increasing, hence injective, so the images $f(G_n)$ are pairwise disjoint and countable additivity applies.
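The disjoint-union step can also be sanity-checked numerically. Below, $G$ is a (finite, for computability) disjoint union of open intervals and $f(x) = x^2 + x$ as before; both the intervals and $f$ are illustrative assumptions of mine, not data from the problem:

```python
# Compare m(f(G)) with ∫_G f' dm for G a disjoint union of open intervals,
# using f(x) = x^2 + x, f'(x) = 2x + 1 (illustrative choices).

def f(x):
    return x * x + x

def fprime(x):
    return 2 * x + 1

intervals = [(0.1, 0.2), (0.4, 0.7)]  # disjoint open intervals forming G

# f is strictly increasing, so f((a, b)) = (f(a), f(b)) and the images
# are pairwise disjoint; their measures add up.
measure_image = sum(f(b) - f(a) for a, b in intervals)

# Midpoint Riemann sum approximating ∫_G f' dm, interval by interval.
N = 100_000
integral = 0.0
for a, b in intervals:
    h = (b - a) / N
    integral += sum(fprime(a + (k + 0.5) * h) for k in range(N)) * h

assert abs(measure_image - integral) < 1e-8
print(measure_image, integral)
```

Of course this only exercises the countable-union case; the question of an arbitrary $G_{\delta}$ set remains.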
Sketch of proof.
Consider the family of Borel sets $\mathcal G = \{G\in\mathcal B([0,1]) : m(f(G)) = \int_G f'\,dm \}$.