Attempting to rigorously prove the first fundamental theorem of integration for left and right derivatives


First we define the left and right hand derivatives.

The left derivative of any function $f$ is defined to be $f^{-}(x) = \lim_{h \to 0^{-}} \frac {f(x+h) - \lim_{a \to x^{-}} f(a)}{h}$.

The right derivative of any function $f$ is defined to be $f^{+}(x) = \lim_{h \to 0^{+}} \frac {f(x+h) - \lim_{a \to x^{+}} f(a)}{h}$.
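To make the modified definitions concrete, here is a small numeric sketch (the helper names are mine, not part of the question). For the unit step function, the usual one-sided difference quotients blow up at the jump, but with the one-sided limit substituted for $f(x)$, both one-sided derivatives come out $0$:

```python
# Numeric sketch of the modified one-sided derivatives for the unit step
# f(x) = 1 if x >= 0 else 0.  The one-sided limits lim_{a -> x^-} f(a) and
# lim_{a -> x^+} f(a) are supplied by hand, since f is known explicitly.

def step(x):
    return 1.0 if x >= 0 else 0.0

def right_derivative(f, x, right_limit, h=1e-6):
    # f^+(x) = lim_{h -> 0^+} (f(x+h) - lim_{a -> x^+} f(a)) / h
    return (f(x + h) - right_limit) / h

def left_derivative(f, x, left_limit, h=1e-6):
    # f^-(x) = lim_{h -> 0^-} (f(x+h) - lim_{a -> x^-} f(a)) / h,
    # evaluated here at h = -1e-6
    return (f(x - h) - left_limit) / (-h)

# At the jump x = 0: lim_{a -> 0^+} step(a) = 1, lim_{a -> 0^-} step(a) = 0.
print(right_derivative(step, 0.0, 1.0))  # 0.0, not +infinity
print(left_derivative(step, 0.0, 0.0))   # 0.0
```

With the ordinary definition, the right difference quotient at $0$ would be $(f(h) - f(0))/h = 0$ but the left one would be $(f(h) - f(0))/h = -1/h \to +\infty$; the substitution of the one-sided limit is exactly what tames the jump.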

I've been able to prove the addition, multiplication, and (integer) power rules for these two operators. I am fine assuming any previous knowledge or proofs regarding the integral itself. I wish to prove the following statement:

Let $f$ be a piecewise continuous function and, for a fixed real number $a$, define $g(x) = \int^{x}_{a} f(t)\, dt$. Then $g$ is left and right hand differentiable everywhere, with $g^{+}(x) = \lim_{b \to x^{+}} f(b)$ and $g^{-}(x) = \lim_{b \to x^{-}} f(b)$.
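As a quick numeric sanity check of the claimed statement (names and setup are mine): take $f$ to be the unit step and $a = -1$, so $g(x) = \max(x, 0)$. The one-sided difference quotients of $g$ at the jump $x = 0$ should approach $\lim_{b \to 0^{+}} f(b) = 1$ and $\lim_{b \to 0^{-}} f(b) = 0$:

```python
# Numeric check of the claim for the unit step f, with a = -1.
# g(x) = integral from -1 to x of f, so g(x) = max(x, 0).

def f(t):
    return 1.0 if t >= 0 else 0.0

def g(x, a=-1.0, n=100_000):
    # midpoint Riemann sum for the integral of f from a to x
    w = (x - a) / n
    return w * sum(f(a + (k + 0.5) * w) for k in range(n))

h = 1e-3
g_plus = (g(0 + h) - g(0)) / h       # right difference quotient of g at 0
g_minus = (g(0 - h) - g(0)) / (-h)   # left difference quotient of g at 0
print(g_plus, g_minus)  # approximately 1 and 0
```

Since $g$ is continuous, $\lim_{a \to 0^{\pm}} g(a) = g(0)$ and the modified one-sided derivatives reduce to the ordinary one-sided difference quotients used here.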

The exact reason is that it is needed to patch a hole in one of the proofs here (in fact, the only hole I know of, aside from my recent habit of writing "$-$" and "$+$" backwards in limits):

Piecewise Constant Functions in Differential and Functional Equations

In particular I use it in the first implied derivative proof as quoted here:

By the definition of $f$ being piecewise continuous it has a left and right hand limit everywhere. Therefore for every real number $x$ we have that $F^{+}(x) = \lim_{a \to x^{+}} f(a)$ and $F^{-}(x) = \lim_{a \to x^{-}} f(a)$. Furthermore, since $\lim_{a \to x^{+}} f(a) = f(x)$ or $\lim_{a \to x^{-}} f(a) = f(x)$ we can state that $F^{+}(x) = f(x)$ or $F^{-}(x) = f(x)$. Therefore by the definition of an implied derivative, $f$ is an implied derivative of $F$.

Of course, the link is not self-evident, as that proof was one of the first written in that paper (about a month ago). At that time the one-sided derivatives were defined in the usual way. I changed the definition because it makes the one-sided derivatives exist for functions where the usual ones do not, while agreeing with them wherever both are defined.


BEST ANSWER

Well, the general result is as follows:

Theorem: Let $f:[a, b] \to\mathbb {R} $ be Riemann integrable on $[a, b] $ and let $F:[a, b] \to\mathbb{R} $ be defined via $F(x) =\int_{a} ^{x} f(t) \, dt$. For any $c\in[a, b] $, if the limit $\lim_{x\to c^{+}} f(x) $ exists, then the right hand derivative of $F$ at $c$ exists and is equal to this limit. The analogous statement holds for the left hand limit of $f$ and the left hand derivative of $F$.

The proof is based on the fact that the existence of the right hand limit (say $L$) of $f$ at $c$ constrains the values of $f$ near $c$: for any given $\epsilon >0$ there is a $\delta>0$ for which $$L-\epsilon<f(x) <L+\epsilon$$ for all $x$ with $c<x<c+\delta$. Therefore, for $0<h<\delta$, the expression $$F(c+h) - F(c) =\int_{c} ^{c+h} f(x) \, dx$$ lies between $h(L-\epsilon) $ and $h(L+\epsilon) $. Thus $$\left|\frac{F(c+h) - F(c)} {h} - L\right|\leq \epsilon $$ for all $0<h<\delta$. The right hand derivative of $F$ at $c$ is therefore equal to $L$.
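The squeeze above can be watched numerically (a sketch of mine, not part of the answer): take $f(t) = t + 2$ for $t \ge 1$ and $f(t) = t$ otherwise, so that at $c = 1$ the right hand limit is $L = 3$. Exactly as the bound predicts, the difference quotient error $|(F(c+h)-F(c))/h - L|$ here equals $h/2$:

```python
# f has a jump at t = 1; its right-hand limit there is L = 3.
def f(t):
    return t + 2 if t >= 1 else t

def quotient(c, h, n=10_000):
    # (F(c+h) - F(c)) / h = (1/h) * integral from c to c+h of f,
    # by additivity of the integral; computed with a midpoint sum.
    w = h / n
    return sum(f(c + (k + 0.5) * w) for k in range(n)) / n

c, L = 1.0, 3.0
for h in (0.1, 0.01, 0.001):
    print(h, abs(quotient(c, h) - L))  # error shrinks like h/2
```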


Your definitions of the left and right hand derivatives are slightly different from the usual ones, so that they also apply to discontinuous functions; but that does not matter here, since $F$ is guaranteed to be continuous.

SECOND ANSWER

I assume "piecewise continuous" includes the condition that left and right limits exist everywhere. (As an aside: functions with one-sided limits at every point are sometimes called regulated functions.)

If you know the result for continuous functions, then you can use it to prove the result for piecewise continuous functions.

For the right derivative:

Fix $f,a,x.$ There is an interval $(x,x+\epsilon)$ on which $f$ is continuous. There is a continuous function $f_2:\mathbb R\to\mathbb R$ that is equal to $f$ on $(x,x+\epsilon)$ - just define it to be equal to $\lim_{t\to x^+} f(t)$ on $(-\infty,x]$ and equal to $\lim_{t\to (x+\epsilon)^-}f(t)$ on $[x+\epsilon,\infty).$ Define $g_2(y)=g(x)+\int_x^y f_2(t)\,dt.$ Then $g$ and $g_2$ are equal on $(x,x+\epsilon)$ because they both map $y$ to $\int_a^yf(t)\,dt.$ The right derivative at $x$ (in your definition) is determined by the values in $(x,x+\epsilon),$ so the right derivatives of $g$ and $g_2$ are the same; and by the continuous case, $g_2^{+}(x) = f_2(x) = \lim_{t\to x^+} f(t),$ as desired.
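The extension step can be sketched as follows (helper names are mine; note that the constant values are the limits of $f$ taken from *inside* $(x, x+\epsilon)$, which is what makes $f_2$ continuous at the seams):

```python
# Build the continuous extension f2 of a piecewise continuous f that is
# continuous on the open interval (x, x + eps).

def make_f2(f, x, eps, at_x, at_x_eps):
    # at_x     = lim_{t -> x^+} f(t)          (constant value on (-inf, x])
    # at_x_eps = lim_{t -> (x+eps)^-} f(t)    (constant value on [x+eps, inf))
    def f2(t):
        if t <= x:
            return at_x
        if t >= x + eps:
            return at_x_eps
        return f(t)
    return f2

# Example: f jumps at 0 but equals t + 1 on (0, 1), so the inside limits
# are 1 at 0^+ and 2 at 1^-.
f = lambda t: t + 1 if t > 0 else -5.0
f2 = make_f2(f, 0.0, 1.0, 1.0, 2.0)
print(f2(-3.0), f2(0.5), f2(7.0))  # 1.0 1.5 2.0
```

On $(0,1)$, `f2` agrees with `f`, while outside it is the constant one-sided limit, so `f2` is continuous everywhere even though `f` jumps at $0$.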

The left derivative can be proved symmetrically or by considering right derivatives of $t\mapsto f(-t).$