**Theorem.** Let $\mu$ be a complex Borel measure on $\mathbb{R}$ and let $f(x)=\mu((-\infty,x])$. Then $f'(x)=A$ if and only if for every $\epsilon>0$ there exists $\delta>0$ such that $\left|\frac{\mu(I)}{m(I)}-A\right|<\epsilon$ for every interval $I$ that contains $x$ and satisfies $m(I)<\delta$.
Here $m$ denotes Lebesgue measure on $\mathbb{R}$.
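As a quick sanity check of the statement (this example is mine, not from the book): take $\mu=\delta_0$, the unit point mass at $0$, so $f=\chi_{[0,\infty)}$. For $x\neq 0$ we have $f'(x)=0$, and indeed $\mu(I)=0$ for every interval $I\ni x$ with $m(I)<|x|$. At $x=0$ both conditions fail together: $f$ is not even continuous there, and every interval $I\ni 0$ has $\mu(I)=1$, so $\frac{\mu(I)}{m(I)}=\frac{1}{m(I)}\to\infty$.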
I found this theorem in *Real and Complex Analysis* by W. Rudin, stated without proof, and I have not managed to find a proof of it anywhere else.
I also tried to prove it myself but did not get anywhere.
Can someone provide a proof of this statement?
Thank you in advance!
**Hint.**
$$f'(x) = \lim_{h \to 0} \frac{f(x+h)-f(x)}{h} = \lim_{h \to 0} \frac{\mu((-\infty,x+h])-\mu((-\infty,x])}{h}.$$
For $h>0$ the numerator equals $\mu((x,x+h])$, so the difference quotient is $\frac{\mu((x,x+h])}{m((x,x+h])}$; for $h<0$ it equals $-\mu((x+h,x])$, so the quotient is $\frac{\mu((x+h,x])}{m((x+h,x])}$. Thus the derivative is exactly the limit of $\mu(I)/m(I)$ over half-open intervals $I$ having $x$ as an endpoint; it remains to pass to arbitrary short intervals containing $x$, as sketched below.
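Here is a sketch of the two missing steps (my own reconstruction of the argument, not taken from Rudin).

*From $f'(x)=A$ to arbitrary intervals.* Given $\epsilon>0$, choose $\delta>0$ with
$$|f(y)-f(x)-A(y-x)|\le \tfrac{\epsilon}{2}\,|y-x| \qquad \text{whenever } |y-x|<\delta.$$
Since non-strict inequalities survive one-sided limits, the same bound holds with $f(y)$ replaced by the left limit $f(y-)$. Now let $I$ be any interval with endpoints $a\le x\le b$ and $0<m(I)=b-a<\delta$. Whatever the type of $I$, $\mu(I)$ is a difference of two of the numbers $f(b),f(b-),f(a),f(a-)$; for instance $\mu((a,b])=f(b)-f(a)$ and $\mu([a,b])=f(b)-f(a-)$, by continuity of $\mu$ from below. Hence
$$|\mu(I)-A\,m(I)|\le \tfrac{\epsilon}{2}(b-x)+\tfrac{\epsilon}{2}(x-a)=\tfrac{\epsilon}{2}\,m(I),$$
so $\left|\frac{\mu(I)}{m(I)}-A\right|\le\frac{\epsilon}{2}<\epsilon$.

*Converse.* Applying the hypothesis to the intervals $[x-h,x+h]$ gives $|\mu([x-h,x+h])|\le(|A|+\epsilon)\,2h\to 0$, so $\mu(\{x\})=0$ by continuity of $\mu$ from above. Then for $0<h<\delta$ we get $\frac{f(x+h)-f(x)}{h}=\frac{\mu((x,x+h])}{h}=\frac{\mu([x,x+h])}{h}$, and $[x,x+h]$ is an interval containing $x$ with measure less than $\delta$, so this quotient is within $\epsilon$ of $A$; similarly for $h<0$, using $(x+h,x]\ni x$. Hence $f'(x)=A$.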