Define the sequences $\{a_n\}$ and $\{b_n\}$ by $$a_n=\dfrac{1}{n^{\alpha}(\log n)^\beta}\quad\text{and}\quad b_n=\dfrac{1}{n^p},$$ where $\alpha>1$, $\beta< 0$, and $p\in(1,\alpha)$. I'm trying to prove that $$\frac{a_n}{b_n}=\dfrac{1}{n^{\alpha-p}(\log n)^{\beta}}\rightarrow0\quad \text{as}\;n\rightarrow \infty,$$ so that I can conclude $\sum a_n$ converges by comparison with the convergent series $\sum b_n$.
My question is: is that true? What happens if $-\beta$ is too large?
An answer to a similar question is given in https://math.stackexchange.com/q/2693760, but it is not clear to me why I can bound $(\log n)^{-\beta}$ by $Cn^{\gamma}$ for some constant $C$.
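For what it's worth, here is a quick numerical sanity check with illustrative values of my own choosing ($\alpha=2$, $p=3/2$, $\beta=-10$, so $-\beta$ is fairly large). The ratio does seem to decrease eventually, but only for very large $n$:

```python
import math

# Illustrative values (my own choice, not forced by the problem):
# alpha = 2, p = 1.5, beta = -10, so
# a_n / b_n = (log n)^(-beta) / n^(alpha - p) = (log n)^10 / sqrt(n).
alpha, p, beta = 2.0, 1.5, -10.0

def ratio(n: float) -> float:
    """The comparison ratio a_n / b_n = (log n)^(-beta) / n^(alpha - p)."""
    return math.log(n) ** (-beta) / n ** (alpha - p)

# The ratio grows at first, peaks near n = exp(-beta / (alpha - p)) = e^20,
# and only then begins its slow decay toward 0.
for n in [1e3, 1e6, 1e9, 1e12, 1e18]:
    print(f"n = {n:.0e}: ratio = {ratio(n):.3e}")
```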
I wrote in a comment "Any positive power of $n$, no matter how small the power is, beats out any positive power of $\log n,$ no matter how large the power is." To verify this, use L'Hôpital's rule to show that for any $a>0,$
$$\lim_{x\to \infty} \frac{\ln x}{x^a} = 0.$$
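(Carrying out the L'Hôpital step: differentiate numerator and denominator to get
$$\lim_{x\to \infty} \frac{\ln x}{x^a} = \lim_{x\to \infty} \frac{1/x}{a x^{a-1}} = \lim_{x\to \infty} \frac{1}{a x^{a}} = 0.)$$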
Then note that if $a,b>0,$
$$\frac{(\ln x)^b}{x^a} = \left ( \frac{\ln x}{x^{a/b}}\right )^b.$$
This $\to 0$ by the first result.
You will be able to solve your problem from this result.
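Concretely: since $p<\alpha$ and $\beta<0$, take $a=\alpha-p>0$ and $b=-\beta>0$ in the identity above, so that
$$\frac{a_n}{b_n}=\frac{(\log n)^{-\beta}}{n^{\alpha-p}}=\left(\frac{\log n}{n^{(\alpha-p)/(-\beta)}}\right)^{-\beta}\rightarrow 0,$$
no matter how large $-\beta$ is.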