I've read on Wikipedia:
> By stopping once the desired precision is reached, floating-point numbers can be approximated to arbitrary precision. If a real number x is approximated by any rational number a/b that is not in the sequence of mediants found by the algorithm above, then the sequence of mediants contains a closer approximation to x that has a denominator at most equal to b; in that sense, these mediants form the best rational approximations to x.
Suppose you have a rational $0 < q = \frac{a}{b} < 1$ and you want to find $\tilde q = \frac{\tilde a}{\tilde b}$ such that $\Delta = |q- \tilde q| < \frac{1}{n}$. How deep would you have to descend in the tree?
By looking at the tree, I got:
- without any descent ($d=0$), you can guarantee $\Delta < 1$
- $d=1 \Rightarrow \Delta < \frac{1}{2}$
- $d = 2 \Rightarrow \Delta < \frac{1}{3}$
- $d = 3 \Rightarrow \Delta < \frac{1}{4}$
So it looks very much like:
$d \Rightarrow \Delta < \frac{1}{d+1}$. This seems plausible, as the worst case appears to be a $q$ close to $0$, where the mediants visited are $\frac{1}{2}, \frac{1}{3}, \frac{1}{4}, \dots$
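To sanity-check this pattern, here is a small sketch (Python, using the standard `fractions` module; the function name is mine) that descends the Stern-Brocot tree toward a $q$ very close to $0$ and checks the conjectured bound at every depth:

```python
from fractions import Fraction

def mediant_path(q, depth):
    """Descend `depth` steps in the Stern-Brocot tree toward q in (0, 1),
    returning the list of mediants visited."""
    lo, hi = Fraction(0, 1), Fraction(1, 1)
    path = []
    for _ in range(depth):
        # mediant of the current enclosing interval
        m = Fraction(lo.numerator + hi.numerator,
                     lo.denominator + hi.denominator)
        path.append(m)
        if q < m:
            hi = m
        elif q > m:
            lo = m
        else:
            break  # hit q exactly
    return path

# a q very close to 0 seems to be the worst case
q = Fraction(1, 10**9)
for d, m in enumerate(mediant_path(q, 50), start=1):
    assert abs(q - m) < Fraction(1, d + 1)  # the conjectured bound holds
```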
But is there a way to get a better approximation? I mean, suppose you have to approximate $q_1 = 0.7320508$. Then you would get these results:
- $\frac{1}{1}$, $\Delta = 0.26794920$
- $\frac{1}{2}$, $\Delta = 0.23205080$
- $\frac{2}{3}$, $\Delta = 0.06538413$
- $\frac{3}{4}$, $\Delta = 0.01794920$
- $\frac{5}{7}$, $\Delta = 0.01776509$
- $\frac{8}{11}$, $\Delta = 0.00477807$
- $\frac{11}{15}$, $\Delta = 0.00128253$
- $\frac{19}{26}$, $\Delta = 0.00128157$
- $\frac{30}{41}$, $\Delta = 0.00034348$
- $\frac{41}{56}$, $\Delta = 0.00009206$
- $\frac{71}{97}$, $\Delta = 0.00009204$
- $\frac{112}{153}$, $\Delta = 0.00002466$
- $\frac{153}{209}$, $\Delta = 0.00000662$
- $\frac{265}{362}$, $\Delta = 0.00000660$
- $\frac{418}{571}$, $\Delta = 0.00000176$
- $\frac{571}{780}$, $\Delta = 0.00000048$
- $\frac{989}{1351}$, $\Delta = 0.00000047$
- $\frac{1560}{2131}$, $\Delta = 0.00000012$
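For reference, this table (from $\frac{1}{2}$ onward) can be reproduced with exact rational arithmetic; the following is just a sketch, with a helper name of my own choosing:

```python
from fractions import Fraction

def mediants(q, n):
    """First n mediants on the Stern-Brocot path toward q in (0, 1)."""
    lo, hi = Fraction(0, 1), Fraction(1, 1)
    out = []
    for _ in range(n):
        m = Fraction(lo.numerator + hi.numerator,
                     lo.denominator + hi.denominator)
        out.append(m)
        if q < m:
            hi = m
        elif q > m:
            lo = m
        else:
            break  # q reached exactly
    return out

q1 = Fraction(7320508, 10**7)
for m in mediants(q1, 17):
    print(m, float(abs(q1 - m)))
```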
But when I have $q_2 = \frac{1}{2} + \frac{1}{100000}$, I know it will take VERY long until I get an improvement. Intuitively, I'd say it would take about $\frac{100000}{2} + 1$ steps until the approximation gets better.
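That guess can be tested empirically: the sketch below (again Python's `fractions`; the function name is mine) descends toward $q_2$ and records the step numbers at which the mediant beats every earlier approximation:

```python
from fractions import Fraction

def improvement_steps(q, max_steps):
    """Descend the Stern-Brocot tree toward q in (0, 1) and return the
    step numbers at which the mediant improves on the best error so far."""
    lo, hi = Fraction(0, 1), Fraction(1, 1)
    best = None
    steps = []
    for step in range(1, max_steps + 1):
        m = Fraction(lo.numerator + hi.numerator,
                     lo.denominator + hi.denominator)
        delta = abs(q - m)
        if best is None or delta < best:
            best = delta
            steps.append(step)
        if q < m:
            hi = m
        elif q > m:
            lo = m
        else:
            break  # q reached exactly
    return steps

q2 = Fraction(1, 2) + Fraction(1, 100000)
print(improvement_steps(q2, 13000)[:2])  # step of 1/2, step of the next improvement
```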
Question
Given a rational number $q = \frac{a}{b}$ and an approximation $\tilde q = \frac{\tilde a}{\tilde b}$ at step $d$, is there any better guarantee for approximations at step $d+i$ with $i > 0$?
Or is there any way to tell in advance after how many steps an improvement will happen?