I have a sequence $a_n$ for $n\in\mathbb{N}$ such that $\lim_{n\rightarrow\infty}\frac{a_n}{\log n} = 0$.
Let $poly(n) := \sum_{i\in S} c(i)\, n^i > 0$, where $S$ is a finite subset of the real numbers and the $c(i)$ are real coefficients. Does it hold that
$$\lim_{n\rightarrow\infty}\frac{a_n}{\log \left(poly(n)\right)} = 0\,?$$
My idea for a proof is to bound the polynomial between powers of $n$: there exist some $r, s\in \mathbb{N}$ such that $n^s\geq poly(n) \geq n^r$ for all sufficiently large $n$. Using the monotonicity of the logarithm, I obtain
$$ \frac{a_n}{\log \left(n^s\right)} \leq \frac{a_n}{\log \left( poly(n)\right)} \leq \frac{a_n}{\log \left(n^r\right)}$$
By the sandwich rule, the middle term tends to zero, since
$$\lim_{n\rightarrow\infty} \frac{a_n}{s\log n} = 0 \quad\text{and}\quad \lim_{n\rightarrow\infty} \frac{a_n}{r\log n} = 0.$$
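As a quick numerical sanity check (not a substitute for a proof), one can pick a concrete sequence with $a_n/\log n \to 0$, say $a_n = \log\log n$, and a concrete positive polynomial; both choices below are illustrative assumptions, not part of the question:

```python
import math

# Illustrative choices (assumptions for the sanity check):
# a_n = log(log n), which satisfies a_n / log n -> 0,
# poly(n) = 3 n^2 + 5 n + 1, a positive polynomial.
def a(n):
    return math.log(math.log(n))

def poly(n):
    return 3 * n**2 + 5 * n + 1

# Ratio a_n / log(poly(n)) at increasingly large n.
ratios = [a(n) / math.log(poly(n)) for n in (10**3, 10**4, 10**5, 10**6)]
print(ratios)  # the ratios shrink toward 0 as n grows
```

For this particular pair the ratio does decrease toward zero, consistent with the sandwich bounds above; of course a finite check says nothing about all admissible $a_n$.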
Is this proof correct, or is there some sequence $a_n$ for which the argument fails?