Analyse the convergence of $$a_n=\frac{1}{n^2}+\frac{1}{(n+1)^2}+\dots+\frac{1}{(2n)^2}$$ and find the point of convergence if it does converge.
I proved that it converges to $0$ by noting that $$\frac{1}{n^2} \le a_n \le (n+1)\times \frac{1}{n^2}$$ (the sum has $n+1$ terms, each at most $\frac{1}{n^2}$) and using the Sandwich Theorem.
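As a quick numerical sanity check of these bounds (a Python sketch; the helper name `a` is mine, not from the problem):

```python
# Check the sandwich bounds 1/n^2 <= a_n <= (n+1)/n^2 for a few values of n.
# `a` is a hypothetical helper name, not part of the original argument.

def a(n):
    """a_n = 1/n^2 + 1/(n+1)^2 + ... + 1/(2n)^2  (n+1 terms)."""
    return sum(1 / k**2 for k in range(n, 2 * n + 1))

for n in (10, 100, 1000):
    # Both bounds hold, and both tend to 0 as n grows.
    assert 1 / n**2 <= a(n) <= (n + 1) / n**2
```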
But I was following a book by Arihant, which gives the following solution (I am quoting it without changing any notation):
Define $a_n=\frac n{(n+n)^2}$ so that $\lim_{n\to \infty} a_n=0$. Now, use Cauchy's Theorem to get $$0=\lim_{n\to \infty} \left(\frac{a_1+\dots+a_n}n\right)=\lim_{n\to \infty} \left(\frac{1}{(n+1)^2}+\dots+\frac 1{(2n)^2}\right)$$ and hence the result follows.
I believe this solution is absolute rubbish (although it would be good to have a check). But I do like the idea and the approach. Can this approach (with the necessary modifications) be used to solve these kinds of limits?
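For what it's worth, here is a quick numerical check (a Python sketch; the names `cesaro_mean` and `tail_sum` are mine) that the book's displayed identity fails: with the book's $a_k=\frac{k}{(k+k)^2}=\frac{1}{4k}$, the Cesàro mean $\frac{a_1+\dots+a_n}{n}$ is not equal to $\frac{1}{(n+1)^2}+\dots+\frac{1}{(2n)^2}$, even though both happen to tend to $0$.

```python
# The book sets a_k = k/(k+k)^2 = 1/(4k).  Compare its Cesàro mean
# (a_1 + ... + a_n)/n with the sum 1/(n+1)^2 + ... + 1/(2n)^2 that the
# book equates it with.  (Function names are mine, not the book's.)

def cesaro_mean(n):
    return sum(1 / (4 * k) for k in range(1, n + 1)) / n

def tail_sum(n):
    return sum(1 / k**2 for k in range(n + 1, 2 * n + 1))

n = 1000
print(cesaro_mean(n), tail_sum(n))
# The two sides differ markedly (roughly ln(n)/(4n) vs 1/(2n)),
# so the book's middle equality does not hold.
assert cesaro_mean(n) > 2 * tail_sum(n)
```

So the book's final limit is correct, but the identification of the sum with a Cesàro mean is not.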
As given, $a_{n}=\frac{1}{n^2}+\frac{1}{(n+1)^2}+\dots+\frac{1}{(2n)^2}$.

To make this more transparent, introduce a summation index $k=0,1,\dots,n$, so that

$a_{n}=\frac{1}{n^2}+\frac{1}{(n+1)^2}+\dots+\frac{1}{(2n)^2}=\sum_{k=0}^{n}\frac{1}{(n+k)^2}$

Hence, $\lim_{n\rightarrow\infty}a_{n}=\lim_{n\rightarrow\infty}\left(\sum_{k=0}^{n}\frac{1}{(n+k)^2}\right)$.

Since each of the $n+1$ terms satisfies $\frac{1}{(n+k)^2}\le\frac{1}{n^2}$, we get $0\le a_{n}\le\frac{n+1}{n^2}\rightarrow 0$ as $n\rightarrow\infty$.

So, $\lim_{n\rightarrow\infty}a_{n}=0$.
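As a numerical illustration of this limit (a sketch; the helper name `a_n` is not part of the original answer):

```python
# a_n = sum_{k=0}^{n} 1/(n+k)^2 for growing n: the values shrink toward 0,
# consistent with the bound (n+1)/n^2.  (`a_n` is a hypothetical helper.)

def a_n(n):
    return sum(1 / (n + k) ** 2 for k in range(n + 1))

values = [a_n(10 ** p) for p in range(1, 5)]
print(values)
assert all(u > v for u, v in zip(values, values[1:]))  # strictly decreasing
```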