Show that the series
$$\sum_{n=1}^{\infty}\dfrac{1}{\ln^{k}(n+1)}$$
diverges for each $k\in \mathbb{N}$.
To solve the problem I first tried the integral test, but it did not work out, so I turned to Raabe's criterion, which led me to the following limit:
$$\lim_{n\rightarrow \infty} n\left(1 -\frac{\ln^{k}(n+1)}{\ln^{k}(n+2)} \right)$$
which numerically appears to converge to zero for every $k \in \mathbb{N}$, but I don't know how to argue that convergence formally. How can I justify the value of this limit? Alternatively, is there some other way to solve the original problem?
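The numerical observation can be reproduced with a short script. This is just an illustration, not a proof; `raabe_term` is a hypothetical helper name evaluating the expression inside the limit above.

```python
import math

def raabe_term(n: int, k: int) -> float:
    """Evaluate Raabe's expression n * (1 - ln^k(n+1) / ln^k(n+2))."""
    return n * (1 - (math.log(n + 1) / math.log(n + 2)) ** k)

# The terms shrink as n grows, for every fixed k, consistent with a limit of 0:
for k in (1, 2, 5):
    print(k, [raabe_term(10**p, k) for p in (2, 4, 6)])
```

Since the limit is $0 < 1$, Raabe's criterion in this form cannot establish convergence, which is consistent with the series diverging.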
I was also looking through some books and I believe the divergence of the series can be seen using infinite products, but I have no idea how to apply infinite products here.
Note that, by L'Hôpital's Rule,\begin{align}\lim_{x\to\infty}\frac x{\log^k(x)}&=\frac1k\lim_{x\to\infty}\frac x{\log^{k-1}(x)}\\&=\frac1{k(k-1)}\lim_{x\to\infty}\frac x{\log^{k-2}(x)}\\&{}\qquad\vdots\\&=\frac1{k!}\lim_{x\to\infty}x\\&=\infty.\end{align}So, if $x$ is large enough, you have $\frac x{\log^k(x)}>1$, which means that $\frac1{\log^k(x)}>\frac1x$. In particular, $\frac1{\log^k(n+1)}>\frac1{n+1}$ for all sufficiently large $n$. So, since the harmonic series diverges, by comparison so does your series.
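To see the comparison step concretely, one can check numerically that $\log^k(n+1) < n+1$ once $n$ is large enough (the threshold depends on $k$). A minimal sketch, with `dominates` as a hypothetical helper name:

```python
import math

def dominates(n: int, k: int) -> bool:
    """True when ln^k(n+1) < n + 1, i.e. 1/ln^k(n+1) > 1/(n+1)."""
    return math.log(n + 1) ** k < n + 1

# For k = 3 the inequality fails at small n but holds for large n:
print(dominates(10, 3), dominates(100, 3), dominates(10**6, 3))
```

For small $n$ the inequality can fail, but only finitely many terms are affected, so the comparison with the harmonic series still applies.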