Rate of convergence from asymptotic limit


I am working on an algorithm in which a parameter $a(t)$ satisfies the asymptotic bound $f(a(t))=\mathcal{O}(1/t^{2})$. I use this to assess an empirical rate of convergence of $f(a(t))$ by checking that $\frac{f(a(t))}{f(a(10t))}\approx100$. It works reasonably well in practice, but I know it is not mathematically rigorous. Do you have any suggestions on how to justify computing $\frac{f(a(t))}{f(a(10t))}$ from the asymptotic bound?
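For concreteness, here is a minimal sketch of the empirical check I am running (in Python; `f_of_a` is a hypothetical stand-in for my actual $f(a(t))$, chosen only so that it decays like $1/t^2$):

```python
import numpy as np

# Hypothetical stand-in for f(a(t)): decays like 1/t^2 up to a
# bounded oscillating factor g(t) = 2 + sin(t).
def f_of_a(t):
    return (2.0 + np.sin(t)) / t**2

# Empirical rate check: for O(1/t^2) decay, multiplying t by 10
# should divide f(a(t)) by roughly 10^2 = 100.
for t in [10.0, 100.0, 1000.0]:
    ratio = f_of_a(t) / f_of_a(10 * t)
    print(f"t = {t:6.0f}   f(a(t)) / f(a(10t)) = {ratio:.2f}")
```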


There is 1 answer below.


$f(a(t))=O(1/t^2)$ means that $f(a(t))=\frac{g(t)}{t^2}$ for some bounded function $g$. So $$\frac{f(a(t))}{f(a(10t))} = \frac{\frac{g(t)}{t^2}}{\frac{g(10t)}{100t^2}} = 100\,\frac{g(t)}{g(10t)}$$ The problem is that you need bounds on $g$ in both directions to make this precise. If you know that $$(\forall t)\ 0<A\le g(t)\le B,$$ which is quite common in the analysis of algorithms, then $$\frac{100A}{B}\le\frac{f(a(t))}{f(a(10t))} \le \frac{100B}{A}.$$ But in general you cannot find any good bound (and such a bound may not even exist).
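A minimal numerical sketch of these bounds (assuming, purely for illustration, $g(t)=2+\sin t$, so $A=1$ and $B=3$): the observed ratio always stays inside $[100A/B,\ 100B/A]$ but need not converge to any single value.

```python
import numpy as np

# Illustration of the answer's assumption: f(a(t)) = g(t)/t^2 with
# 0 < A <= g(t) <= B.  Here g(t) = 2 + sin(t), so A = 1 and B = 3.
A, B = 1.0, 3.0
g = lambda t: 2.0 + np.sin(t)
f_of_a = lambda t: g(t) / t**2

t = np.linspace(1.0, 1000.0, 100_000)
ratios = f_of_a(t) / f_of_a(10 * t)  # equals 100 * g(t) / g(10t)

# Every ratio lies in [100*A/B, 100*B/A] = [33.33..., 300].
print("observed range:   ", ratios.min(), ratios.max())
print("theoretical range:", 100 * A / B, 100 * B / A)
assert np.all((ratios >= 100 * A / B) & (ratios <= 100 * B / A))
```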