Proof: Weierstrass Approximation Theorem for derivatives


I'm working through the second edition of Abbott's Understanding Analysis, and I'm stuck on the following (6.7.11):

Assume that f has a continuous derivative on $[a, b]$. Show that there exists a polynomial $p(x)$ such that $|f(x) − p(x)| < ε$ and $|f'(x)-p'(x)| < ε$ for all $x \in [a, b]$.

I know that $|f(x) − p(x)| < ε$ follows directly from the WAT, but I'm not sure about the second part. I believe the Differentiable Limit Theorem may be useful, but I can't quite see how to incorporate it. Thanks!


This can be done without integration, in a more elementary way. That is probably what Abbott intended, since this exercise appears before integration is even introduced in his book.

Since $f'$ is continuous, by the WAT there exists a polynomial $q'(x)$ (the derivative of the family of polynomials $q(x) + C$, where $q(x)$ is a fixed polynomial and $C$ is an arbitrary constant) such that: $$\vert f'(x)-q'(x)\vert \le \epsilon \iff -\epsilon\le f'(x)-q'(x)\le\epsilon $$ (using $\le$ instead of $<$ just makes the estimates cleaner, and the two forms are equivalent here). Consider $p(x)=q(x) - q(a)+ f(a)$. Notice that $p'(x)=q'(x)$ and $p(a)=f(a)$.

Now let $g(x)=\epsilon (x-a)-(f(x)-p(x)).$ Then $$g(a)=0 \quad\text{and}\quad g'(x)=\epsilon-(f'(x)-p'(x))=\epsilon-(f'(x)-q'(x))\ge0,$$ so $g$ is increasing and therefore $$g(a)=0\land g'(x)\ge0\implies g(x)\ge0\,\,\forall x\in [a,b].$$ Hence $$g(x)= \epsilon (x-a)-(f(x)-p(x))\ge0\implies f(x)-p(x)\le\epsilon(x-a)\le\epsilon(m+\vert a \vert),$$ where $m=\max(\vert a \vert,\vert b \vert)$, since $x-a\le b-a\le m+\vert a\vert$.

Using the same method we can show that $-\epsilon(m+\vert a \vert)\le f(x)-p(x)$. Thus: $$-\epsilon(m+\vert a \vert)\le f(x)-p(x)\le \epsilon(m+\vert a \vert)\iff \vert f(x)-p(x) \vert \le \epsilon(m+\vert a \vert),$$ which proves our goal.
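For completeness, the "same method" for the lower bound can be spelled out with the symmetric auxiliary function: $$h(x)=\epsilon(x-a)+(f(x)-p(x)),\qquad h(a)=0,\qquad h'(x)=\epsilon+(f'(x)-q'(x))\ge 0,$$ so $h$ is increasing on $[a,b]$, hence $h(x)\ge 0$, which rearranges to $f(x)-p(x)\ge -\epsilon(x-a)\ge -\epsilon(m+\vert a\vert)$.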

In case you're bothered by the constant multiple of $\epsilon$, we could have started from $\vert f'(x)-q'(x)\vert \le \frac{\epsilon}{m+\vert a \vert}$ instead, but this changes nothing essential: the mathematical idea is still that the difference between the function and the polynomial can be made smaller than any given positive real number, so $p$ approximates $f$ uniformly.
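A quick numerical sketch of this construction (the choices of $f$, the interval, and the fit degree below are all hypothetical, not from the exercise): fit a polynomial $q'$ to $f'$, take any antiderivative $q$, then shift it so that $p(a)=f(a)$.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical example: f = sin on [0, 2]; degree 10 is an arbitrary choice.
a, b = 0.0, 2.0
f, fprime = np.sin, np.cos
xs = np.linspace(a, b, 1000)

# Approximate f' by a polynomial q' (least-squares Chebyshev fit).
qprime = C.Chebyshev.fit(xs, fprime(xs), deg=10, domain=[a, b])

# Take any antiderivative q of q', then shift: p(x) = q(x) - q(a) + f(a),
# so that p' = q' and p(a) = f(a), exactly as in the proof above.
q = qprime.integ()
p = q - q(a) + f(a)

err_deriv = np.max(np.abs(fprime(xs) - qprime(xs)))  # sup |f' - p'|
err_func = np.max(np.abs(f(xs) - p(xs)))             # sup |f - p|
print(err_deriv, err_func)
```

Both sup-errors shrink together as the fit degree grows, mirroring the $\epsilon(m+\vert a\vert)$ bound above.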


I was trying to solve this problem as well. This is my attempt, which makes use of one of the theorems from an earlier section of the chapter. It's more convoluted than the previous answer, though.

Since $f$ is continuously differentiable on $[a,b]$, $f'$ is continuous, so by the WAT, $\forall \epsilon >0$ there exists a polynomial $p$ such that $\forall x \in [a, b], |f'(x) - p(x)| < \epsilon$. Taking $\epsilon = 1/n$, there exists a sequence of polynomials $(p_n)$ that converges uniformly to $f'$. For each $n$, let $q_n$ be the antiderivative of $p_n$ (itself a polynomial) chosen so that $q_n(a) = f(a)$; then $q_n' = p_n$ for all $n$, and the sequence $(q_n(a))$ trivially converges to $f(a)$.

There is an extension of the Differentiable Limit Theorem in Abbott's book (Theorem 6.3.3) that states:

Let $(f_n)$ be a sequence of differentiable functions defined on the closed interval $[a,b]$, and assume $(f'_n)$ converges uniformly to a function $g$ on $[a,b]$. If there exists a point $x_0 \in [a,b]$ for which $f_n(x_0)$ is convergent, then $(f_n)$ converges uniformly. Moreover, the limit function $f= \lim f_n$ is differentiable and satisfies $f' = g$.

The sequence $(q_n)$ satisfies the hypotheses of this theorem: $(q_n(a))$ converges, and the sequence of derivatives $(q'_n)$, where $\forall n, q'_n = p_n$, is uniformly convergent. By construction $\lim p_n = f'$, so the theorem implies $(q_n)$ converges uniformly to some $h$ with $h' = f'$. Thus $h$ and $f$ have the same derivative. Since properties of the integral can't be used (this exercise precedes the integration section of the book), apply the Mean Value Theorem to $f - h$: its derivative is identically zero on $[a,b]$, so $f - h$ is constant, and since $f(a) = h(a)$ by construction, $f = h$.

Thus on $[a,b]$, $(q_n)$ is a sequence of polynomials uniformly converging to $f$, and $(q'_n)$ uniformly converges to $f'$. So by the definition of uniform convergence, $\forall \epsilon >0$, we can find some $m$ such that $\forall x\in [a,b], |q_m(x) - f(x)| < \epsilon$ and $|q'_m(x) - f'(x)| < \epsilon$.
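The sequence construction can likewise be illustrated numerically (again with hypothetical choices of $f$ and degrees): build each $q_n$ as the antiderivative of a degree-$n$ fit $p_n$ to $f'$, pinned so that $q_n(a)=f(a)$, and watch both sup-errors decrease with $n$.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical illustration: f = sin on [0, 2].
a, b = 0.0, 2.0
f, fprime = np.sin, np.cos
xs = np.linspace(a, b, 1000)

def sup_errors(n):
    """Fit p_n to f', build q_n with q_n(a) = f(a), return both sup-errors."""
    p_n = C.Chebyshev.fit(xs, fprime(xs), deg=n, domain=[a, b])
    q_n = p_n.integ()
    q_n = q_n - q_n(a) + f(a)  # pin q_n(a) = f(a), as in the construction
    return (np.max(np.abs(f(xs) - q_n(xs))),       # sup |f - q_n|
            np.max(np.abs(fprime(xs) - p_n(xs))))  # sup |f' - q_n'|

errs = {n: sup_errors(n) for n in (2, 4, 8)}
print(errs)  # both errors shrink as n grows
```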