There is an enormous amount of literature on Diophantine approximations, including the general theory of continued fractions, the Stern-Brocot tree, the notion of "badly approximable number" and the Markov constant, the theory of convergents and best rational approximations, and so on.
Of particular interest to me is the general result that the most difficult irrational number to approximate is the golden ratio, as well as the so-called "noble numbers," whose continued fraction expansion is eventually all $1$'s. All of these results typically evaluate the quality of an approximation to some real number $r$ by some rational $n/d$ using the linear absolute distance $|r - n/d|$. Often we "weight" this distance by multiplying by a factor or two of $d$: for instance, a "badly approximable number" is one for which $|r - n/d| > \frac{c}{d^2}$, for some constant $c > 0$ and for all rationals $n/d$, and from this we have the result that the most badly approximable number is $\phi$.
In my situation, I am more interested in the "multiplicative" error in approximating $r$ with $n/d$, meaning the quotient of the two: in particular the quantity $\max\left(\frac{r}{n/d}, \frac{n/d}{r}\right)$ (given that $r$ and $n/d$ are positive). If we like, we can instead take the logarithm of this quantity, sometimes called the "logarithmic error," so that we get $|\log(r) - \log(n/d)|$. There are plenty of ways we can weight either of these metrics by some simple function of $d$, as we did before, so that we can proceed along a similar line of reasoning as in traditional Diophantine approximation theory, only measuring the "multiplicative" rather than "additive" error of a rational approximation.
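To make the distinction concrete, here is a small sketch (my own choice of example, using $\pi$ and the classical approximation $22/7$) computing the three error measures; note that the logarithmic error is exactly the log of the multiplicative error:

```python
import math

def errors(r, n, d):
    """Return (additive, multiplicative, logarithmic) error of approximating r by n/d.

    Assumes r > 0 and n/d > 0, as in the discussion above.
    """
    q = n / d
    additive = abs(r - q)
    multiplicative = max(r / q, q / r)   # always >= 1, equals 1 iff q == r
    logarithmic = abs(math.log(r) - math.log(q))
    return additive, multiplicative, logarithmic

add_err, mult_err, log_err = errors(math.pi, 22, 7)
# log(multiplicative error) == logarithmic error, up to floating point
assert abs(math.log(mult_err) - log_err) < 1e-12
```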
Question: is there any research showing how some of the usual results may generalize to using the multiplicative error rather than additive error? Do any of the standard results still remain true, or in some sense generalize, if we use this different metric - in particular Hurwitz's theorem?
And most importantly for my purposes: do we still have that the golden ratio, and the noble numbers, are the hardest to approximate? If not, what numbers are?
EDIT: original title was "multiplicative error," which I changed to "logarithmic error" to match the term in this paper: http://dl.acm.org/doi/pdf/10.1145/363235.363263
Here is a partial answer to the question that can be seen only using elementary means.
Let us assume that $r$ is a positive real number, and suppose that $c_i = n_i/d_i$ is the best approximation to $r$ with denominator $i$; i.e. we have $n_i = \operatorname{round}(r\cdot i)$ and $d_i = i$. Then the Markov constant of $r$ is defined as follows:
$$ \limsup_{i\to\infty} \frac{1}{\left|r - \frac{n_i}{d_i}\right|d_i^2} $$
The reciprocal of this quantity is really the elementary thing here; the "weighted error" of approximating $r$ by $\frac{n}{d}$:
$$ \left|r - \frac{n}{d}\right|d^2 $$
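For instance (a quick numerical sketch, with `round` playing the role it does in the definition above), along the best approximations to the golden ratio this weighted error hovers near $1/\sqrt{5} \approx 0.447$, reflecting its Markov constant $\sqrt{5}$:

```python
import math

phi = (1 + math.sqrt(5)) / 2

def weighted_error(r, d):
    """|r - n/d| * d^2, with n the nearest integer to r*d."""
    n = round(r * d)
    return abs(r - n / d) * d * d

# The best approximations to phi have Fibonacci denominators;
# along them the weighted error approaches 1/sqrt(5) ~ 0.4472.
fib = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
errs = [weighted_error(phi, d) for d in fib]
```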
Since the convergents are also positive we have $|n/d| = n/d$. So we can rewrite this as
$$ \left|r - \frac{n}{d}\right| \cdot \frac{d}{n} \cdot \frac{n}{d} \cdot d^2 = \left|\frac{r}{n/d} - 1\right| \cdot (n d) $$
As $n/d \to r$, the quantity $\frac{r}{n/d} \to 1$. Since $x - 1 \sim \log(x)$ as $x \to 1$, in this limit we have
$$ \frac{r}{n/d} - 1 \;\sim\; \log\left(\frac{r}{n/d}\right) = \log(r) - \log\left(\frac{n}{d}\right), $$
which is the logarithmic error.
Thus our original expression for the weighted error -- in the limit of small error -- tends to
$$ \left|\log(r) - \log\left(\frac{n}{d}\right)\right| \cdot (n d) $$
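As a sanity check on this asymptotic equality (a numerical sketch with function names of my own choosing), the linear and logarithmic weighted errors agree in the limit along the golden ratio's convergents:

```python
import math

phi = (1 + math.sqrt(5)) / 2

def linear_weighted(r, n, d):
    """The classical weighted error |r - n/d| * d^2."""
    return abs(r - n / d) * d * d

def log_weighted(r, n, d):
    """The logarithmic weighted error |log r - log(n/d)| * (n*d)."""
    return abs(math.log(r) - math.log(n / d)) * n * d

# Consecutive Fibonacci numbers give the convergents n/d of phi.
fibs = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]
pairs = list(zip(fibs[1:], fibs[:-1]))   # (n, d) = (F_{k+1}, F_k)
ratios = [log_weighted(phi, n, d) / linear_weighted(phi, n, d) for n, d in pairs]
# The ratio of the two weighted errors tends to 1, so they share the same limsup.
```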
So when we are talking about things like the Markov constant, i.e. where we look at the limit of a weighted error as the denominators go to infinity, we should get the same results. Thus the golden ratio is the least "logarithmically approximable" number, and so on.
On the other hand, I am not quite sure if other results also generalize. For instance, are the "best rational approximations" using linear weighted error also the best using logarithmic weighted error? What does this mean regarding convergents, semiconvergents, and so on?