Help in Measuring Error on Estimates of Differential Equations


I am working on a project for class where I have to estimate solutions of a damped harmonic oscillator ($x'' + 2\gamma x' + \omega^2 x = 0$) and compare three methods for doing so: Third-Order Runge–Kutta, Conformal Explicit Leap-Frog, and Conformal Implicit Midpoint.

Part of this is comparing what the error looks like at each step (using the easily obtainable exact solution), but I am having trouble coming up with a good error measure. Normally relative error is the fallback, but this is the first project I've done where the solution can be $0$, so relative error explodes whenever the solution is near zero and is not an option. Absolute error doesn't help either: since the solution decays to $0$, the graph of the absolute error just looks like the damped oscillation itself.
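To make the failure mode concrete, here is a small Python/NumPy illustration (the project itself is in MATLAB, but the same idea carries over directly; the parameters and the synthetic "approximation" below are invented for the demo, not taken from the class code). Relative error blows up at the zero crossings of the exact solution even when the absolute error is uniformly tiny:

```python
import numpy as np

# Exact underdamped solution x(t) = exp(-gamma*t) * cos(omega_d*t),
# plus a synthetic approximation off by a small smooth perturbation.
gamma, omega = 0.1, 2.0
omega_d = np.sqrt(omega**2 - gamma**2)

t = np.linspace(0.0, 20.0, 2001)
x_exact = np.exp(-gamma * t) * np.cos(omega_d * t)
x_approx = x_exact + 1e-4 * np.sin(0.7 * t)   # stand-in for numerical error

abs_err = np.abs(x_approx - x_exact)
rel_err = abs_err / np.abs(x_exact)

# Absolute error stays O(1e-4) everywhere, but relative error spikes
# by orders of magnitude wherever x(t) passes through zero.
print(f"max absolute error: {abs_err.max():.3e}")
print(f"max relative error: {rel_err.max():.3e}")
```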

So I am wondering: what might be a good way to measure the error? I am working in MATLAB and was given the code for all three methods, so I am confident they work. I know that Runge–Kutta should be better than the other two on short intervals, but the other two are better in the long run.

Thanks for any help


1 Answer


Since you know that the solution is decaying, on the order of $e^{-\gamma t}$, a natural way to measure the error on some interval $a\le t\le b$ is $$ \sup_{a\le t\le b} e^{\gamma t} |x(t) - \tilde x(t)| $$ where $\tilde x$ is the approximation. In other words, normalize not by the exact solution itself (which oscillates across $0$), but by the decaying envelope that sets the magnitude of the oscillation.
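This envelope-weighted sup-norm is a one-liner on a sampled time grid. A Python/NumPy sketch (the asker's project is in MATLAB, and the function name, parameters, and synthetic test data here are assumptions for illustration): if the raw error itself decays like $e^{-\gamma t}$, the weighted error stays $O(1)$ instead of vanishing with the solution, so methods remain comparable over long intervals.

```python
import numpy as np

def weighted_sup_error(t, x_exact, x_approx, gamma):
    """sup over the grid of exp(gamma*t) * |x(t) - x~(t)|,
    i.e. the error measured relative to the decaying envelope."""
    return np.max(np.exp(gamma * t) * np.abs(x_exact - x_approx))

# Synthetic check: an approximation whose raw error decays with the
# envelope, amplitude 1e-3, so the weighted error should be ~1e-3.
gamma, omega = 0.1, 2.0
omega_d = np.sqrt(omega**2 - gamma**2)
t = np.linspace(0.0, 20.0, 2001)
x_exact = np.exp(-gamma * t) * np.cos(omega_d * t)
x_approx = x_exact + 1e-3 * np.exp(-gamma * t) * np.sin(omega_d * t)

print(f"weighted sup error: {weighted_sup_error(t, x_exact, x_approx, gamma):.3e}")
```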