Let $f\in L^{1}(\mathbb R)$ and $g(x)=\int_{-x}^{x}f(y)\,dy$; prove that $g'$ exists a.e. and compute it


The Problem: Let $f\in L^{1}(\mathbb R)$ and $$ g(x)=\int_{-x}^{x}f(y)dy.$$ Prove that $g$ is differentiable almost everywhere and compute $g'$.

My Thoughts: First I restrict to $h>0$ for simplicity and look at the difference quotient to see where it leads. Let $\varepsilon>0$ be given. Since $\frac{f}{h}\in L^{1}(\mathbb R)$ for each fixed $h>0$, there is a $\delta>0$ such that $$ \int_{E}\frac{\vert f\vert}{h}<\frac{\varepsilon}{2} \quad\text{whenever}\quad m(E)<\delta. $$

Putting the above together, we see that if $h>0$ is small enough, then \begin{align*} \Bigg\vert\frac{g(x+h)-g(x)}{h}\Bigg\vert &=\frac{1}{h}\Bigg\vert\int_{-x-h}^{x+h}f(y)\,dy-\int_{-x}^{x}f(y)\,dy\Bigg\vert \\ &\leq\int_{x}^{x+h}\frac{\vert f(y)\vert}{h}\,dy+\int_{-x-h}^{-x}\frac{\vert f(y)\vert}{h}\,dy \\ &\leq\frac{\varepsilon}{2}+\frac{\varepsilon}{2} \\ &=\varepsilon. \end{align*} It follows that $g'\equiv0$ almost everywhere, since a similar computation yields the same result for the left derivative.
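As a quick numerical sanity check (not part of the intended proof; the function names `f` and `g` and the test point are my own choices): take the concrete integrable function $f(y)=e^{-y^2}$, for which $g(x)=\sqrt{\pi}\,\operatorname{erf}(x)$ in closed form. The difference quotient then does not tend to $0$, which already suggests the conclusion $g'\equiv 0$ cannot be right.

```python
# Sanity check with a concrete f in L^1(R): f(y) = exp(-y^2).
# Then g(x) = \int_{-x}^{x} e^{-y^2} dy = sqrt(pi) * erf(x),
# so the difference quotient should approach f(x) + f(-x) = 2*exp(-x^2),
# not 0.
import math

def g(x):
    return math.sqrt(math.pi) * math.erf(x)

x, h = 1.0, 1e-6
dq = (g(x + h) - g(x)) / h      # forward difference quotient
print(dq)                       # ≈ 0.7358
print(2 * math.exp(-x**2))      # f(x) + f(-x) = 2/e ≈ 0.7358
```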

Note: I used a theorem that says that if $f\in L^{1}(\mathbb R^d)$ then given $\varepsilon>0$, there is a $\delta>0$ such that $\int_{E}\vert f\vert<\varepsilon$ whenever $m(E)<\delta.$


Is my proof correct? I am not confident in the way I worded it, and I think it has a minor mistake, especially with the way I handled the $h$.

Any feedback is much appreciated. Thank you for your time.


There are 2 answers below.

Accepted answer:

Wrong. The correct answer is $g'(x)=f(x)+f(-x)$ a.e.

Where is the problem in your work? In the first statement (absolute continuity). Note that the $h$ you put there has nothing to do with it. The correct statement is:

$$f \in L^1 \text{ implies that for every } \epsilon>0 \text{ there exists } \delta(\epsilon)>0 \text{ such that } \int_E |f| \, dm < \epsilon \text{ whenever } m(E)<\delta.$$

Think of it this way: splitting the integral at $0$ and substituting $t\mapsto -t$ on $[-x,0]$ gives

$$g(x) = \int_0^x \big(f(t) + f(-t)\big)\, dm(t).$$

By the FTC for Lebesgue integrals (a consequence of the Lebesgue differentiation theorem), $g'(x)$ exists a.e. and equals $f(x) + f(-x)$.
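This formula can be illustrated numerically (an illustration, not a proof; the choice of $f$ and the test point are mine): with $f(y)=\frac{1}{1+y^2}\in L^1(\mathbb R)$ we have $g(x)=2\arctan(x)$ in closed form, and the difference quotient matches $f(x)+f(-x)=\frac{2}{1+x^2}$.

```python
# Illustrating g'(x) = f(x) + f(-x) for f(y) = 1/(1+y^2), which is in L^1(R).
# Here g(x) = \int_{-x}^{x} f(y) dy = 2*atan(x) in closed form.
import math

def g(x):
    return 2 * math.atan(x)

x, h = 0.5, 1e-6
dq = (g(x + h) - g(x - h)) / (2 * h)   # central difference quotient
print(dq)               # ≈ 1.6
print(2 / (1 + x**2))   # f(x) + f(-x) = 1.6
```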

Second answer:

The problem occurs in this step: \begin{align*} \int_{x}^{x+h}\frac{\vert f(y)\vert}{h}\,\mathrm{d}y+\int_{-x-h}^{-x}\frac{\vert f(y)\vert}{h}\,\mathrm{d}y \leq\frac{\varepsilon}{2}+\frac{\varepsilon}{2} . \end{align*} But this only holds if, for $E_h = [x,x+h]$ (and likewise for $[-x-h,-x]$), you have $m(E_h) = h < \delta$.

In your first statement you said that for a fixed $h$ there is a $\delta$ such that the bound holds. Hence $\delta$ depends on $h$. So if you shrink $h$ so that $E_h$ satisfies $m(E_h)<\delta$, then for your new $h$ you have to take a new $\delta$, and this new $\delta$ might be smaller than $m(E_h)$; the argument chases its own tail and never closes.