Non-positive distributional derivative implies a.e. monotonicity


Suppose $f\in L^1_{loc}(\mathbb{R})$ and that $f'\le 0$ in the sense of distributions, i.e. for all $\varphi \in C^\infty_c(\mathbb{R})$ with $\varphi\ge0$ we have $\int_\mathbb{R}f\varphi'\ge 0$.
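
Just to record the convention I am using: for $\varphi\in C^\infty_c(\mathbb{R})$ the distributional derivative is defined by
$$\langle f',\varphi\rangle := -\int_\mathbb{R} f\varphi'\,dx,$$
so $f'\le 0$ means $\langle f',\varphi\rangle\le 0$ for every $\varphi\ge 0$, which is exactly the inequality $\int_\mathbb{R}f\varphi'\ge 0$ above.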

How can I prove that $f$ is almost everywhere decreasing, i.e. that $f(x+y)\le f(x)$ for almost every $x\in \mathbb{R}$ and every $y \in \mathbb{R}^+$?

My obvious attempt was to use the fact that $\lim_{h\to 0} \int_\mathbb{R} \varphi(x)\, \frac{f(x)-f(x-h)}{h}\,dx\le 0$ to get a local statement: for every sufficiently small $h>0$, $f(x)-f(x-h)\le 0$ for a.e. $x$. Can I proceed from this?
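
For completeness, here is the computation behind that limit: shifting the variable of integration in the term containing $f(x-h)$ and using that $\varphi$ has compact support,
$$\int_\mathbb{R} \varphi(x)\,\frac{f(x)-f(x-h)}{h}\,dx = -\int_\mathbb{R} f(x)\,\frac{\varphi(x+h)-\varphi(x)}{h}\,dx \;\xrightarrow[h\to 0]{}\; -\int_\mathbb{R} f\varphi'\,dx \le 0,$$
where the convergence holds because the difference quotients of $\varphi$ converge uniformly to $\varphi'$ and are supported in a fixed compact set, so dominated convergence applies with $f\in L^1_{loc}(\mathbb{R})$.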