Proving a function decays polynomially


Let $f:\mathbb{R}\to\mathbb{R}$ be such that $f(x)=O\left(\left(\frac{1}{\log x}\right)^{\lambda}\right)$ as $x\to\infty$ for some constant $\lambda>0$. Can we prove that $f$ decays polynomially, that is, $f(x)=O\left(x^{-\mu}\right)$ for some $\mu>0$?

My idea was to compare with a Taylor series, but this doesn't work since $\log x$ has no Taylor expansion valid as $x\to\infty$. If we had, for example, $f(x)=O(\log x)$, we could argue that since $\log x=O(x)$ we get such a bound. But with $\frac{1}{\log x}$ there is no such analogue, so I am unsure how to proceed. Any ideas would be great!
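As a quick numerical experiment (my own illustration, not part of the question), one can compare a candidate polynomial bound $x^{-\mu}$ against $\left(\frac{1}{\log x}\right)^{\lambda}$ for large $x$. The parameter choices `mu = 0.1` and `lam = 1.0` below are arbitrary assumptions for the sketch. If the ratio $x^{-\mu}\big/\left(\frac{1}{\log x}\right)^{\lambda} = x^{-\mu}(\log x)^{\lambda}$ tends to $0$, that suggests the logarithmic bound decays *slower* than the polynomial one, hinting at where the difficulty lies:

```python
import math

def ratio(x, mu=0.1, lam=1.0):
    """x^(-mu) divided by (1/log x)^lam, i.e. x^(-mu) * (log x)^lam.

    If f(x) = O(x^(-mu)) were implied by the logarithmic bound,
    we would expect this ratio to stay bounded away from 0;
    instead it appears to shrink toward 0 for large x.
    """
    return x ** (-mu) * math.log(x) ** lam

# Evaluate the ratio at increasingly large x.
for x in [1e3, 1e6, 1e12, 1e24, 1e50]:
    print(f"x = {x:.0e}:  ratio = {ratio(x):.6f}")
```

The printed values eventually decrease toward $0$, which is consistent with $x^{-\mu}(\log x)^{\lambda}\to 0$ for any fixed $\mu>0$; of course this is only a numerical hint, not a proof.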