Suppose $\lbrace\gamma_n\rbrace_{n\in\mathbb{N}}$ is a sequence of real numbers such that $\gamma_n>1$ for all $n$ and $\liminf\gamma_n=1$.
Under which conditions (if any) does
$$
\sum_{n=1}^{\infty}\frac{1}{n^{\gamma_n}}
$$
converge?
Initially I was thinking about an arbitrary sequence $\lbrace \gamma_n\rbrace_{n\in\mathbb{N}}$, but it is easy to see that if $\lbrace \gamma_n\rbrace_{n\in\mathbb{N}}$ is eventually bounded below by a constant greater than $1$, then the sum converges, while if $\gamma_n\leq 1$ for all sufficiently large $n$, then it diverges by comparison with the harmonic series. So (if I am not wrong) we can restrict attention to sequences that are eventually greater than $1$ (and, since finitely many terms do not affect convergence, we may as well assume every term is greater than $1$) whose limit inferior is $1$.
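To spell out both comparisons (a short sketch; here $\delta>0$ and $N$ just name the constants implicit in "eventually"):
$$
\gamma_n\geq 1+\delta \ \ (n\geq N)\implies \sum_{n\geq N}\frac{1}{n^{\gamma_n}}\leq\sum_{n\geq N}\frac{1}{n^{1+\delta}}<\infty,
$$
$$
\gamma_n\leq 1 \ \ (n\geq N)\implies \sum_{n\geq N}\frac{1}{n^{\gamma_n}}\geq\sum_{n\geq N}\frac{1}{n}=\infty.
$$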
I think that maybe it is worth noting that if $\alpha_n>1$ for all $n$ and $\liminf\alpha_n=1$, then we can construct a non-increasing sequence $\lbrace \alpha'_n\rbrace_{n\in\mathbb{N}}$ with $\lim\limits_{n\to\infty}\alpha'_n=1$ and $\alpha'_n\leq \alpha_n$ for all $n$; for instance, the running minimum $\alpha'_n=\min_{1\leq k\leq n}\alpha_k$ works. Since $\alpha'_n\leq\alpha_n$ gives $\frac{1}{n^{\alpha'_n}}\geq\frac{1}{n^{\alpha_n}}$, convergence of $\sum_{n=1}^\infty\frac{1}{n^{\alpha'_n}}$ forces convergence of the original series, so for sufficient conditions it is enough to study $\sum_{n=1}^\infty\frac{1}{n^{\gamma_n}}$ when $\lbrace\gamma_n\rbrace_{n\in\mathbb{N}}$ is non-increasing with limit $1$.
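For instance (a pair of standard test sequences to show why the rate at which $\gamma_n\to 1$ matters; they are illustrations, not a candidate answer): using $n^{1/\ln n}=e$ and $n^{(\ln\ln n)/\ln n}=\ln n$ for $n\geq 3$,
$$
\gamma_n=1+\frac{1}{\ln n}\implies \frac{1}{n^{\gamma_n}}=\frac{1}{e\,n}\quad\text{(diverges)},
\qquad
\gamma_n=1+\frac{2\ln\ln n}{\ln n}\implies \frac{1}{n^{\gamma_n}}=\frac{1}{n(\ln n)^2}\quad\text{(converges)},
$$
and both sequences are eventually non-increasing with limit $1$, so no condition on $\liminf\gamma_n$ alone can decide convergence.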