Absolutely continuous on every closed interval iff $\int_{\mathbb{R}}f'(t)dt = 1$


Let $f: \mathbb{R}\rightarrow \mathbb{R}$ be a non-negative increasing function with $$\lim_{t\to-\infty}f(t) = 0, \lim_{t\to\infty}f(t) = 1$$

Prove that $f$ is absolutely continuous on every closed interval iff $\int_{\mathbb{R}}f'(t)dt = 1$.

The forward direction isn't any problem: if $f$ is absolutely continuous on every closed bounded interval, I can apply the fundamental theorem of calculus (for the Lebesgue integral) to $f$ on $[-n,n]$ and then let $n \to \infty$.
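Spelled out, the limiting argument looks like this (using that $f' \geq 0$ a.e., since $f$ is increasing). Absolute continuity on $[-n,n]$ gives

$$\int_{-n}^{n} f'(t)\,dt = f(n) - f(-n),$$

and since $f' \geq 0$ a.e., the monotone convergence theorem yields

$$\int_{\mathbb{R}} f'(t)\,dt = \lim_{n\to\infty}\int_{-n}^{n} f'(t)\,dt = \lim_{n\to\infty}\big(f(n) - f(-n)\big) = 1 - 0 = 1.$$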

How could I go about proving the backwards direction though? I thought about trying to prove that $f$ must be an indefinite integral, but I'm still a bit stuck.

But all I can come up with for that approach is to first define $h(x) = \int_a^x f'(t)\,dt$, for some fixed $a$. Then $h'(x) = \frac{d}{dx}\int_a^x f'(t)\,dt = f'(x)$ almost everywhere, by the Lebesgue differentiation theorem (since $f'$ is integrable). I'm not really sure where to go from here. I'm tempted to say that, because the derivatives of the two functions are equal almost everywhere, the functions themselves must differ by a constant, and since $h$ is an indefinite integral it would then follow that $f$ is an indefinite integral and thus absolutely continuous. But this makes no use of the assumption that $\int_{\mathbb{R}}f'(t)\,dt = 1$, so I assume I'm making a mistake somewhere.
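(For what it's worth, the standard Cantor-function example shows exactly where that step breaks, and why the hypothesis $\int_{\mathbb{R}} f' = 1$ is needed. Let $c$ be the Cantor function on $[0,1]$, extended by $c(t)=0$ for $t<0$ and $c(t)=1$ for $t>1$. Then $c$ is continuous, increasing, with the right limits at $\pm\infty$, and $c'=0$ a.e., so

$$h(x) = \int_a^x c'(t)\,dt = 0 \ \text{ for all } x, \qquad \int_{\mathbb{R}} c'(t)\,dt = 0 \neq 1,$$

yet $c$ is not constant and is not absolutely continuous on $[0,1]$. So "equal derivatives a.e. $\Rightarrow$ differ by a constant" is precisely what fails without absolute continuity.)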

Any thoughts would be greatly appreciated.

Thanks in advance.


There are 2 best solutions below


Hint: Use Fatou's Lemma and the definition of the derivative to show that $f(b)-f(a) \leq \int_a^{b} f'(t)\,dt$ whenever $a < b$. Now use the fact that $f'$ is integrable, so given $\epsilon > 0$ there exists $\delta > 0$ such that $\int_E f'(t)\,dt < \epsilon$ whenever $m(E) < \delta$. Absolute continuity of $f$ on closed bounded intervals now follows easily.
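One way to fill in the first part of the hint: applying Fatou's Lemma to the difference quotients $g_n(t) = n\big(f(t+\tfrac{1}{n}) - f(t)\big) \geq 0$ gives $\int_c^d f' \leq f(d) - f(c)$ for any $c < d$, and letting $c \to -\infty$ or $d \to \infty$,

$$\int_{-\infty}^{a} f' \leq f(a) - \lim_{t\to-\infty} f(t) = f(a), \qquad \int_{b}^{\infty} f' \leq \lim_{t\to\infty} f(t) - f(b) = 1 - f(b).$$

Combined with the hypothesis $\int_{\mathbb{R}} f' = 1$, this flips the inequality on $[a,b]$:

$$\int_a^b f' = 1 - \int_{-\infty}^a f' - \int_b^\infty f' \geq 1 - f(a) - \big(1 - f(b)\big) = f(b) - f(a).$$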


For the converse, we already have $\int_{a}^{b} f' \leq f(b)-f(a)$ whenever $a < b$, from the monotonicity of $f$ (via Fatou's Lemma applied to the difference quotients). So now, assume for a contradiction that $\int_{a}^{b} f' < f(b) - f(a)$. Using $\int_{-\infty}^{a} f' \leq f(a) - \lim_{t \to -\infty} f(t)$ and $\int_{b}^{\infty} f' \leq \lim_{t \to \infty} f(t) - f(b)$,
\begin{align*}
1 &= \int_{-\infty}^{a} f' + \int_{a}^{b} f' + \int_{b}^{\infty} f' \\
&< f(b) - f(a) + \int_{-\infty}^{a} f' + \int_{b}^{\infty} f' \\
&\leq f(b) - f(a) - \lim_{t \to -\infty} f(t) + f(a) + \lim_{t \to \infty} f(t) - f(b) \\
&= 1,
\end{align*}
a contradiction. Therefore $\int_{a}^{b} f' = f(b) - f(a)$ for all $a < b$; that is, $f$ is an indefinite integral of its derivative on every closed interval, and hence absolutely continuous there.
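To pass from this equality to absolute continuity, one can combine it with the absolute continuity of the Lebesgue integral, as the other answer hints: given $\epsilon > 0$, choose $\delta > 0$ so that $\int_E f' < \epsilon$ whenever $m(E) < \delta$. Then for any finite collection of disjoint intervals $(a_i, b_i) \subset [a,b]$ with $\sum_i (b_i - a_i) < \delta$,

$$\sum_i \big(f(b_i) - f(a_i)\big) = \sum_i \int_{a_i}^{b_i} f'(t)\,dt = \int_{\bigcup_i (a_i, b_i)} f'(t)\,dt < \epsilon,$$

which is exactly the definition of absolute continuity on $[a,b]$.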