I am trying to find a simple proof that the series $\sum 1/n^{\sigma+it}$ diverges for $\sigma\leq 0$.
By simple, I mean not requiring university-level machinery such as the general theory of Dirichlet series.
For clarity, I mean the series, not the analytic continuation.
My Failed Attempt
Here is my attempt, writing $\alpha=-\sigma$ (so $\alpha\geq 0$) to make reading easier:
$$\sum 1/n^{\sigma+it} = \sum n^\alpha e^{-it\ln(n)}$$
Here $n^\alpha$ grows without bound as $n\rightarrow\infty$ (for $\alpha>0$). If the terms were positive reals, this alone would be sufficient to prove divergence.
However, the second factor $e^{-it\ln(n)}$ complicates the task as it "rotates" each $n^\alpha$ meaning the magnitude of the partial sums can fall as well as grow.
Even a simpler version of the problem with $\sigma=0$ escapes me.
$$\sum 1/n^{it} = \sum e^{-it\ln(n)}$$
Intuitively I can see that $\ln(n)$ grows slowly enough for the unit-length "vectors" $e^{-it\ln(n)}$ not to rotate enough to cause the partial sum to decrease in magnitude - but I can't turn this intuition into a proof.
If I can prove this simpler problem, then the larger one follows, because term by term
$$\left|n^\alpha e^{-it\ln(n)}\right| = n^\alpha \geq 1 = \left|e^{-it\ln(n)}\right|$$
So if the terms $e^{-it\ln(n)}$ fail to tend to zero, then so do the terms $n^\alpha e^{-it\ln(n)}$, and $\sum 1/n^{\sigma+it}$ diverges as well.
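As a quick sanity check (not a proof), the partial sums of $\sum e^{-it\ln(n)}$ can be computed numerically for a sample value of $t$ (here $t=1$, my choice, not from the question); their magnitude appears to grow roughly linearly, consistent with divergence:

```python
# Numerical illustration only, for the assumed sample value t = 1:
# partial sums of sum_{n>=1} e^{-i t ln n} = sum_{n>=1} n^{-it}.
t = 1.0
partial = 0j
magnitudes = []
for n in range(1, 10_001):
    partial += n ** (-1j * t)  # n^{-it} = e^{-i t ln n}
    magnitudes.append(abs(partial))

# |S_N| grows roughly like N / sqrt(1 + t^2), so it keeps increasing.
print(magnitudes[99], magnitudes[999], magnitudes[9999])
```

Of course this only suggests divergence for one value of $t$; it does not replace an argument.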
A necessary condition for the convergence of a series $\sum_{n=1}^\infty a_n$ is that $\lim_{n\to \infty} a_n = 0$ (this is sometimes called the "term test" or "divergence test").
In the case of this Dirichlet series, $|1/n^{\sigma+it}| = 1/n^\sigma = n^{-\sigma} \geq 1$ for $\sigma \le 0$, so the terms $1/n^{\sigma+it}$ do not converge to zero as $n \to \infty$, and the series cannot converge in that case.
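The modulus computation above is easy to verify numerically; $\sigma$ and $t$ below are sample values of my choosing with $\sigma \le 0$:

```python
# Sample values (assumed for illustration) with sigma <= 0.
sigma, t = -0.5, 2.0

# |1/n^{sigma+it}| = n^{-sigma}, which is >= 1 for all n when sigma <= 0,
# so the terms cannot tend to zero and the series fails the term test.
moduli = [abs(1 / n ** complex(sigma, t)) for n in range(1, 1001)]
print(min(moduli), moduli[-1])
```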