As a deeply enthusiastic prospective undergraduate student, there is a fact about big $O$ notation that I still don't completely understand, namely:
Let $f(x), g(x) \neq x$ be nonconstant differentiable functions with $f(x), g(x) = O(x)$. Does it necessarily follow that
$\dfrac{f(x) - x}{f(x) - g(x)} = \dfrac{O(x) - x}{O(x)} = O(1)$ ?
My attempt:
Since $f(x), g(x) = O(x)$, it follows that $f'(x), g'(x) = O(1)$; hence, by L'Hôpital's rule, we have $\lim_{x \to \infty} \dfrac{f(x) - x}{f(x) - g(x)} = \dfrac{O(1)}{O(1)} = O(1)$, as required?
For an explicit counterexample, consider $f(x) = 2x$ and $g(x) = 2x - 1/x$. Both are $O(x)$. Then $f(x) - x = x$ and $f(x) - g(x) = 1/x$, so $$\frac{f(x) - x}{f(x) - g(x)} = \frac{x}{1/x} = x^2,$$ which is not $O(1)$; it is not even $O(x)$!
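As a quick numerical sanity check (a minimal Python sketch of the counterexample above; nothing here is specific to Python), the ratio tracks $x^2$:

```python
# Sanity check: f(x) = 2x and g(x) = 2x - 1/x are both O(x),
# yet (f(x) - x) / (f(x) - g(x)) = x / (1/x) = x^2, which is unbounded.

def f(x):
    return 2 * x

def g(x):
    return 2 * x - 1 / x

for x in [10.0, 100.0, 1000.0]:
    ratio = (f(x) - x) / (f(x) - g(x))
    print(f"x = {x:6.0f}: ratio = {ratio:10.1f}, x^2 = {x**2:10.1f}")
```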
This is a good intuition-building exercise in why you must be careful when combining multiple big-oh expressions. In particular, a big-oh bound on a function in a denominator is not useful for upper-bounding the overall expression: to bound a quotient from above, you need a *lower* bound on the denominator, and big-oh only supplies upper bounds.
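To make that concrete (a standard estimate, not specific to this problem): for $x \ge x_0$, $$\left|\frac{N(x)}{D(x)}\right| \;\le\; \frac{|N(x)|}{\inf_{t \ge x_0} |D(t)|},$$ which is only useful when the infimum in the denominator is positive. In the counterexample, $f(x) - g(x) = O(x)$ holds but tells us nothing here, because $|f(x) - g(x)| = 1/x$ has no positive lower bound as $x \to \infty$.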