I am trying to work through Information Theory, Inference, and Learning Algorithms by David MacKay.
However, I am confused by his use of the notation
$$ \simeq. $$
For example, on the very first page he uses it to state a form of Stirling's approximation: $$ x! \simeq x^x \exp(-x). $$
From this statement, I know that he cannot mean "asymptotically equal to" in the usual sense, where $$ f(x) \sim g(x)$$ means $$\lim_{x \to \infty} \dfrac{f(x)}{g(x)} = 1,$$ since Stirling's approximation is in fact $$x! \sim x^x \exp(-x) \sqrt{2 \pi x}.$$
My working hypothesis is that $$ f(x) \simeq g(x)$$ means something like $$\lim_{x \to \infty} \dfrac{\log f(x)}{\log g(x)} = 1,$$ but I have looked online and haven't found any confirmation. Could anyone more familiar either with this book or with asymptotic notation clarify?
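As a quick numerical sanity check of my hypothesis (my own sketch, not anything from MacKay), comparing $\log(x!)$ against $\log(x^x e^{-x}) = x\log x - x$ does suggest the ratio of logarithms tends to 1, even though the ratio of the raw quantities does not:

```python
# Numerical check: does log(x!) / log(x^x exp(-x)) approach 1 as x grows?
# Uses lgamma(x+1) = log(x!) to avoid overflow for large x.
import math

for x in [10, 100, 1000, 10000]:
    log_fact = math.lgamma(x + 1)      # log(x!)
    log_approx = x * math.log(x) - x   # log(x^x exp(-x))
    print(x, log_fact / log_approx)
```

The printed ratios shrink toward 1 as x increases, which is at least consistent with the log-ratio reading; the discrepancy is exactly the $\tfrac12\log(2\pi x)$ term, which is negligible relative to $x \log x - x$.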