We know that $\sum_{n=1}^{\infty}1/n^2$ converges. Why is it allowed to write $\sum_{n=1}^{N}1/n^2\leq \sum_{n=1}^{\infty}1/n^2$ for all $N\geq 1$? I have seen something like this in many places, and I didn't question it because it seemed "obvious". In general, if $(a_n)$ is increasing and bounded above, then it has a limit, say $L$. Is it guaranteed that $a_n\leq L$ for all $n\geq 1$?
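As a concrete sanity check (not a proof), one can compare the partial sums against the known value of the series, $\pi^2/6$:

```python
import math

def partial_sum(N):
    """Partial sum of 1/n^2 up to N; the full series converges to pi^2/6."""
    return sum(1 / n**2 for n in range(1, N + 1))

limit = math.pi**2 / 6

# Every partial sum stays at or below the limit, as the inequality claims.
for N in range(1, 1000):
    assert partial_sum(N) <= limit
```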
What I have thought about this question is as follows: given $\epsilon>0$, there exists $N\in \mathbb{N}$ such that $L-\epsilon<a_n<L+\epsilon$ for all $n\geq N$. Looking at the inequality $a_n<L+\epsilon$ for all $n\geq N$, and since $\epsilon>0$ is arbitrary, this implies that $a_n\leq L$ for all $n\geq N$. But what about $n=1,\dots, N-1$? I say the inequality still holds there because of monotonicity, i.e. we have $a_1\leq a_2\leq \dots\leq a_{N-1}\leq a_N$ for $N\geq 2$.
This is just stating the obvious in formal terms. The basic idea you have pointed out is correct: an increasing sequence never exceeds its limit. Why? If an increasing sequence satisfies $a_n\to L$, then $a_m\leq a_n$ for all $n\geq m$. Keeping $m$ fixed and letting $n\to\infty$, we get $a_m\leq L$, and since $m$ was arbitrary, the claim follows.
The argument is based on the fact that the operation of taking limits is order-preserving. It can weaken a strict inequality to a non-strict one (and most interesting cases fall into this category), but it never reverses the inequality.
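A small illustration of the weakening, using the hypothetical sequence $a_n = -1/n$: every term satisfies $a_n < 0$ strictly, yet the limit is exactly $0$, so only the non-strict inequality $\leq 0$ survives the passage to the limit:

```python
def a(n):
    # a_n = -1/n: every term is strictly negative, but a_n -> 0.
    return -1 / n

# Strict inequality holds term by term...
assert all(a(n) < 0 for n in range(1, 10**6, 999))

# ...but the limit is 0, which only satisfies the weakened inequality <= 0.
limit = 0.0
assert limit <= 0
```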