Approximating a function using its integral


Question: Let $f:\Bbb R \to \Bbb R$ be $C^1$, and for every $\delta>0$ define $$F_\delta(x) = \frac 1{2\delta}\int^{x+\delta}_{x-\delta} f(t) \, dt.$$

Prove that for every $\varepsilon>0$ there exists $\delta>0$ such that $|F_\delta(x)-f(x)|<\varepsilon$ for all $x\in[a,b]$.

What we did: Let $F$ be an antiderivative of $f$; it exists because a continuous function is Riemann integrable. By the Fundamental Theorem of Calculus (Newton–Leibniz) and the Mean Value Theorem (Lagrange), $$|F_\delta(x)-f(x)| = \left| \frac{F(x+\delta)-F(x-\delta)}{2\delta} - f(x) \right| = |F'(c)-f(x)| = |f(c)-f(x)|$$ for some $c\in[x-\delta,x+\delta]$.

Since $f$ is continuous on the compact interval $[a-1,b+1]$, it is uniformly continuous there, so for every $\varepsilon>0$ there is a $\delta\in(0,1)$ such that $|x-c|\le\delta \Rightarrow |f(x)-f(c)|<\varepsilon$. This $\delta$ works simultaneously for all $x\in[a,b]$, giving $|F_\delta(x)-f(x)|<\varepsilon$.
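As a quick numerical sanity check (not part of the proof), here is a minimal Python sketch. The helper `F_delta` is a hypothetical name I introduce; it approximates the average of $f$ over $[x-\delta, x+\delta]$ with a midpoint Riemann sum, and we then look at the worst-case deviation from $f$ over a grid in $[a,b]$:

```python
import math

def F_delta(f, x, delta, n=1000):
    # Approximate (1/(2*delta)) * integral of f over [x-delta, x+delta]
    # using a midpoint Riemann sum with n subintervals.
    h = 2 * delta / n
    total = sum(f(x - delta + (k + 0.5) * h) for k in range(n))
    # total * h / (2*delta) simplifies to total / n
    return total / n

# Example: f = sin on [a, b] = [0, pi], delta = 1e-3
f = math.sin
delta = 1e-3
grid = [i * math.pi / 50 for i in range(51)]
err = max(abs(F_delta(f, x, delta) - f(x)) for x in grid)
print(err)  # uniformly small: for smooth f the error is O(delta**2)
```

For $f=\sin$ the exact value is $F_\delta(x)=\sin(x)\,\frac{\sin\delta}{\delta}$, so the uniform error is about $\delta^2/6$, consistent with the claim that $F_\delta\to f$ uniformly as $\delta\to 0$.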

Is this solution correct?