Why does a singular distribution have its derivative zero almost everywhere?


Let $m$ be the Lebesgue measure and $F:\mathbb{R}\rightarrow \mathbb{R}$ be a monotonically increasing continuous function such that $\lim_{x\to -\infty} F(x)=0$ and $\lim_{x\to \infty} F(x)=1$.

Let $m_F$ be the Lebesgue-Stieltjes measure associated with $F$.

If $m_F$ is singular with respect to $m$, is it true that $F'=0$ $m$-a.e.? How do I prove this? Any references?

Since $F$ is monotonically increasing, $F$ is differentiable $m$-a.e., so $F'$ is well-defined here. Since $m_F$ is singular with respect to $m$, there exists a Borel set $A$ such that $m_F(A)=0$ and $m(A^c)=0$. I think this implies that $F$ is locally constant at $m$-almost every point of $A$, but I cannot prove this. How do I prove it?
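
A useful bridge between $F'$ and $m_F$ here (a standard fact, written out as a worked identity): since $F$ is continuous, $m_F((a,b]) = F(b)-F(a)$, so at every point where $F'$ exists,

$$F'(x) \;=\; \lim_{r\to 0^+} \frac{F(x+r)-F(x-r)}{2r} \;=\; \lim_{r\to 0^+} \frac{m_F\big((x-r,\,x+r]\big)}{m\big((x-r,\,x+r]\big)},$$

so proving $F'=0$ $m$-a.e. amounts to showing that this ratio of measures vanishes at $m$-a.e. $x$.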

My definition for “$m_F$ is singular with respect to $m$” is that there exists a Borel set $A$ such that $m_F(A)=0$ and $m(A^c)=0$.
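
For intuition, the standard example fits this setup exactly: take the Cantor–Lebesgue function $c$ on $[0,1]$ and extend it by

$$F(x) \;=\; \begin{cases} 0, & x \le 0,\\ c(x), & 0 < x < 1,\\ 1, & x \ge 1. \end{cases}$$

Then $m_F$ is concentrated on the middle-thirds Cantor set $C$, so with $A = C^c$ one has $m_F(A)=0$ and $m(A^c)=m(C)=0$; here $F$ is genuinely locally constant on $A$ (on each removed interval and outside $[0,1]$), giving $F'=0$ $m$-a.e.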

Best Answer

Suppose $F'>0$ on some Borel set $A$ with $m(A)>0$; one can show that then $m_F(A)>0$. This rules out singularity: if $S$ is a Borel set with $m_F(S)=0$ and $m(S^c)=0$, apply the claim to $A\cap S$, which still has positive $m$-measure (since $m(S^c)=0$) and on which $F'>0$, to get $0 = m_F(S) \ge m_F(A\cap S) > 0$, a contradiction. To prove the intermediate claim, "foliate" $A$ into the sets $A_n = \{x \in A : F'(x) > 1/n\}$. Since $A = \bigcup_n A_n$, one of the $A_n$ must have positive $m$-measure, and for this $A_n$ the desired inequality is easily seen.
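
To spell out that last step, here is one route (a sketch, using the standard fact that $\int_E F'\,dm \le m_F(E)$ for every Borel set $E$; this follows from Fatou's lemma applied to difference quotients on intervals, together with outer regularity of $m_F$; see, e.g., the chapter on differentiation of measures in Folland's Real Analysis):

$$m_F(A_n) \;\ge\; \int_{A_n} F'\,dm \;\ge\; \int_{A_n} \frac{1}{n}\,dm \;=\; \frac{1}{n}\,m(A_n) \;>\; 0.$$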