My question is about determining the convergence of the generalized harmonic series using the integral test. According to the resource I'm reading (pg. 32), for the generalized harmonic series $H^{(r)}=\sum\limits_{n=1}^\infty {1\over n^r}$: if $r\leq0$ then $H^{(r)}$ clearly diverges, since its terms $n^{-r}$ are all at least $1$ and so do not tend to $0$. If $r>0$, then $f_r(x) = {1\over x^r}$ is positive and strictly decreasing for $x\geq1$, so we can apply the integral test with lower limit $t=1$.
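For reference, the statement of the integral test I'm relying on (as I understand it): if $f$ is positive and decreasing on $[t,\infty)$, then $\sum\limits_{n=t}^\infty f(n)$ converges if and only if $\int_t^\infty f(x)\,dx$ converges. Computing that integral for $f_r$: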
$\int_{1}^{\infty}x^{-r}\,dx = \lim\limits_{n\to\infty}\int_{1}^{n}x^{-r}\,dx = \lim\limits_{n\to\infty}\left.{{x^{-r+1}}\over{-r+1}}\right|_1^n = \lim\limits_{n\to\infty}\left({-1\over{(r-1)\,n^{r-1}}}+ {1\over{r-1}}\right)$
Okay, now we see that if $r>1$, the first term goes to $0$ and we are left with ${1\over{r-1}}$, so the integral (and hence the series) converges; if $0<r<1$, the first term grows without bound, so it diverges. However, what if $r=1$? We get a zero in the denominator of both terms. What does this mean exactly? Or did I just make a mistake in my work?
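For what it's worth, I also tried treating $r=1$ as a separate case. Since the power rule for antiderivatives breaks down at $r=1$ and (if I recall correctly) the antiderivative of $1/x$ is $\ln x$, the computation would instead be:

$\int_{1}^{\infty}{dx\over x} = \lim\limits_{n\to\infty}\int_{1}^{n}{dx\over x} = \lim\limits_{n\to\infty}\left(\ln n - \ln 1\right) = \lim\limits_{n\to\infty}\ln n = \infty$

so the integral (and hence the harmonic series itself) diverges. But I still don't see how this connects to the zero denominators in my formula above, so an explanation of what that degenerate case signifies would be appreciated.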