Question about absolute continuous functions (preservation of null sets)


I'm trying to prove that a function $ f:[a,b] \to \mathbb{R} $ is absolutely continuous iff $ \mu(A) = 0 \implies \mu( f(A)) = 0$ for every $A \subseteq [a,b]$. I'm quite stuck. I'm working from the definition to prove that if $f$ is absolutely continuous then the condition holds, but I can't seem to come up with an appropriate cover for $ f(A)$. I saw on another site that it follows from the fact that $f$ being absolutely continuous is equivalent to $f(x) = f(a) + \int_a^x \phi(t)\,dt$ where $\phi = f'$, but I can't see how that works. Could anyone provide help? Many thanks.




One direct way of dealing with this is to use the fact that a set of measure $0$ on the line has outer measure $0$: given any $\delta > 0$, we can cover the set by countably many open intervals whose total length is less than $\delta$. Feeding such a cover into the definition of absolute continuity then shows that the image has outer measure less than any prescribed $\epsilon > 0$, so it must have measure $0$.
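
A sketch of how the details go (the notation $c_i, d_i$ for the points where $f$ attains its extrema is mine, and the finite-versus-countable bookkeeping in the definition of absolute continuity is glossed over): fix $\epsilon > 0$ and take $\delta > 0$ from the definition of absolute continuity, so that for any finite disjoint family of intervals $(a_i,b_i) \subseteq [a,b]$,
$$\sum_i (b_i - a_i) < \delta \implies \sum_i |f(b_i) - f(a_i)| < \epsilon.$$
Now let $\mu(A) = 0$ and choose open intervals $(a_i, b_i)$, which we may take to be disjoint, with
$$A \subseteq \bigcup_i (a_i, b_i), \qquad \sum_i (b_i - a_i) < \delta.$$
Since $f$ is continuous, on each closed interval $[a_i, b_i]$ it attains its minimum and maximum, say at $c_i, d_i \in [a_i, b_i]$ with $f(c_i) \le f(d_i)$. Then
$$f\bigl(A \cap [a_i, b_i]\bigr) \subseteq \bigl[f(c_i), f(d_i)\bigr], \qquad \mu^*\bigl(f(A)\bigr) \le \sum_i \bigl(f(d_i) - f(c_i)\bigr).$$
The intervals with endpoints $c_i, d_i$ are disjoint subintervals of the $(a_i, b_i)$, so their total length is less than $\delta$, and absolute continuity gives
$$\mu^*\bigl(f(A)\bigr) \le \sum_i |f(d_i) - f(c_i)| < \epsilon.$$
Since $\epsilon > 0$ was arbitrary, $\mu(f(A)) = 0$.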