Asymptotic error expansion of global error for single step methods


My question refers to the proof of the following theorem, but it may suffice to skip the theorem and go straight to the problematic Taylor expansion $(\ast)$:

Let $f(t,y)$ and the single step method $\varphi(t,u,h)$ be sufficiently often continuously differentiable and let the local error have an expansion of the form $$ le(t,h) = d_{p+1}(t)h^{p+1} + d_{p+2}(t)h^{p+2} + \cdots + d_{N+1}(t)h^{N+1} + \mathcal{O}(h^{N+2}). $$ Then, the global error $e_h(t)$ after $n$ steps with size $h$ at $t^* = t_0 + nh$ has an asymptotic expansion $$e_h(t^*) = e_p(t^*)h^p + e_{p+1}(t^*)h^{p+1} + \cdots + e_N(t^*)h^N + E_{N+1}(t^*,h)h^{N+1},$$ where $E_{N+1}(t^*,h)$ for $0<h\leq h_0$ is bounded.
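As a concrete illustration of the theorem (my own sketch, not part of the original post): for explicit Euler ($p=1$) applied to $y'=y$, $y(0)=1$, the global error at $t^*=1$ satisfies $e_h(1) = \frac{e}{2}h + \mathcal{O}(h^2)$, since $(1+h)^{1/h} = e\,(1 - h/2 + \mathcal{O}(h^2))$. The ratio $e_h(1)/h$ should therefore approach $e/2 \approx 1.3591$; the function name `euler_error` is mine.

```python
import math

# Sketch: verify the leading term of the global-error expansion for
# explicit Euler (p = 1) applied to y' = y, y(0) = 1 on [0, 1].
# Here y(1) = e and e_h(1) = (e/2) h + O(h^2).

def euler_error(h):
    n = round(1.0 / h)
    y = 1.0
    for _ in range(n):
        y += h * y          # Euler step for f(t, y) = y
    return math.e - y       # global error e_h(1)

for h in [0.1, 0.05, 0.025, 0.0125]:
    print(h, euler_error(h) / h)   # ratios approach e/2 ≈ 1.3591
```

Halving $h$ roughly halves the error, and the ratio $e_h(1)/h$ stabilizes, consistent with the asymptotic expansion starting at order $p=1$.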

In the proof, a Taylor expansion is carried out

$$\varphi(t,y(t),h) - \varphi(t,y(t) - e_p(t)h^p,h) = \frac{\partial}{\partial y} \varphi(t,y(t),h)\,e_{p}(t)h^p + \mathcal{O}(h^{2p}) \tag{$\ast$} $$

which I don't understand. Apparently, this is a Taylor expansion of a function of $h$ at $0$. But if I consider the function

$\varphi \circ g$, where $$g(h) := (t,y(t) - e_p(t)h^p,h)^T,$$ then the Jacobian is $$J_g(h) = (0,-pe_p(t)h^{p-1},1)^T,$$ whereas the Jacobian of $\varphi$ is $$J_{\varphi}(x,y,h) = \left(\frac{\partial \varphi}{\partial x}, \frac{\partial \varphi}{\partial y}, \frac{\partial \varphi}{\partial h}\right).$$ The chain rule would yield $J_{\varphi \circ g} = J_\varphi \cdot J_g$, which does not correspond to the result in $(\ast)$.

Do you have any suggestions on how to interpret $(\ast)$ correctly?

Accepted answer:

The Taylor expansion is carried out in the perturbation $s$, not in $h$: with $t$ and $h$ held fixed, expand $s \mapsto \varphi(t,y(t)-s,h)$ around $s=0$, $$ \varphi(t,y(t),h)-\varphi(t,y(t)-s,h)=\frac{\partial}{\partial y}\varphi(t,y(t),h)\,s+\mathcal{O}(s^2), $$ and then substitute $s=e_p(t)h^p$, so that $\mathcal{O}(s^2)=\mathcal{O}(h^{2p})$.
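The expansion in $s$ can be checked numerically for a concrete increment function (my choice, not from the post): explicit Euler with $f(t,y)=y^2$, so $\varphi(t,y,h)=y+hy^2$ and $\frac{\partial\varphi}{\partial y}=1+2hy$. The remainder after subtracting the linear term is $\mathcal{O}(s^2)$; for this $\varphi$ it is exactly $-hs^2$.

```python
# Sketch: the answer's expansion is a one-variable Taylor expansion in
# the perturbation s, with t and h held fixed. Example increment
# function: explicit Euler with f(t, y) = y^2 (an assumption for
# illustration only).

def phi(t, y, h):
    return y + h * y**2          # increment function y -> y + h f(t, y)

def dphi_dy(t, y, h):
    return 1.0 + 2.0 * h * y     # partial derivative of phi w.r.t. y

t, y, h = 0.0, 1.0, 0.1
for s in [1e-1, 1e-2, 1e-3]:
    remainder = phi(t, y, h) - phi(t, y - s, h) - dphi_dy(t, y, h) * s
    print(s, remainder / s**2)   # stays bounded (here it equals -h)
```

The bounded ratio `remainder / s**2` is the numerical counterpart of $\mathcal{O}(s^2)$; substituting $s = e_p(t)h^p$ turns it into the $\mathcal{O}(h^{2p})$ term in $(\ast)$.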