Applying Taylor's theorem for a smooth function


I'm solving the following problem: Consider the finite difference approximation $fv(x) = (v(x+h)-v(x))/h \approx v'(x)$. Use Taylor's theorem to show that if $v$ is sufficiently smooth on some interval $[x-h_0, x+h_0]$, then $|v'(x) - fv(x)| \leq M(v)\,h$ for $0<h<h_0$. State clearly the smoothness assumption required for the proof to be valid, and detail the dependence of $M(v)$ on $v$ and $h_0$.

I need help with applying Taylor's expansion. So, $v(x) = v(a) + v'(a)(x-a) + \frac{v''(a)}{2!}(x-a)^2 + \dots + \frac{v^{(k)}(a)}{k!}(x-a)^k + h_k(x)(x-a)^k$, where $h_k(x)\to 0$ as $x\to a$. I have trouble with the arguments. Secondly, assuming I have expanded this, should I bring $v'$ to one side and then subtract $fv(x)$ from it?
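Concretely, the step I am attempting would look like this (taking $a = x$, $x - a = h$, and using the Lagrange form of the remainder with $k=2$):

$$v(x+h) = v(x) + h\,v'(x) + \frac{h^2}{2}\,v''(\xi), \qquad \xi \in (x, x+h),$$

so that, after dividing by $h$ and moving $v'(x)$ to one side,

$$\frac{v(x+h)-v(x)}{h} - v'(x) = \frac{h}{2}\,v''(\xi).$$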

The other thing is: what is the significance of $v$ being smooth?


1 Answer


For $a\in[x-h_0,x+h_0],$

$$v(a+h)=v(a)+hv'(a)+\frac{h^2}{2}v''(a+th)$$

with $0<t<1$ and $0<h<h_0$.

Dividing by $h$ and rearranging gives $fv(a)=\frac{v(a+h)-v(a)}{h}=v'(a)+\frac{h}{2}\,v''(a+th)$. Let $$M=\sup_{s\in[x-h_0,x+h_0]}|v''(s)|,$$ which is finite provided $v$ is twice continuously differentiable on $[x-h_0,x+h_0]$; this is the required smoothness assumption. Then, for every $a$ such that $a+h$ remains in the interval (in particular for $a=x$, since $0<h<h_0$),

$$|v'(a)-fv(a)|\le \frac{M}{2}\,h,$$

so the bound holds with $M(v)=M/2$, which depends on $v$ and $h_0$ only through the supremum of $|v''|$ over $[x-h_0,x+h_0]$.
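The bound is easy to check numerically. Below is a minimal sketch (the choice of $v=\sin$, $x=1$, and $h_0=0.5$ is my own, not part of the question): since $v''=-\sin$ satisfies $|v''|\le 1$ everywhere, we can take $M=1$, and the error of the forward difference should stay below $Mh/2$ for each $h<h_0$.

```python
import math

def forward_diff(v, x, h):
    """Forward-difference approximation fv(x) = (v(x+h) - v(x)) / h."""
    return (v(x + h) - v(x)) / h

# Illustrative choices (assumptions, not from the question): v = sin, x = 1, h0 = 0.5.
x, h0 = 1.0, 0.5
M = 1.0  # sup of |v''| = |sin| over any interval is at most 1

for h in [0.4, 0.2, 0.1, 0.05]:
    err = abs(math.cos(x) - forward_diff(math.sin, x, h))
    bound = M * h / 2
    # The Taylor-theorem bound |v'(x) - fv(x)| <= M*h/2 should hold for each h < h0.
    assert err <= bound, (h, err, bound)
    print(f"h={h:>5}: error={err:.6f}  bound={bound:.6f}")
```

Halving $h$ roughly halves the error, consistent with the first-order $O(h)$ bound derived above.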