I have to evaluate $\lim_{n \to \infty}n/(n^2+1) + n/(n^2+2)+n/(n^2+3)+....+n/(n^2+n)$
Now, for the $r$th term: $\lim_{n \to \infty} \frac{n}{n^2+r} = \lim_{n \to \infty} \frac{1}{n+(r/n)} = 0.$ Since this is true for every term, and limits are distributive over addition, the limit should be zero. But this is wrong. Why?
First of all let's get the terminology correct: "distribute" refers to a property of two binary operations such as addition and multiplication, $$a(b+c)=ab+ac\ .$$ Limits do not fall into this category. You might say that "limits preserve addition" or "limits respect addition" or "the limit of a sum is the sum of the limits".
What this means, more formally, is that $$\lim(a_n+b_n)=\lim a_n+\lim b_n\ .$$ To put it even more carefully: if $\lim a_n$ and $\lim b_n$ both exist, then $\lim(a_n+b_n)$ exists and is equal to their sum.
You can easily extend this to the limit of a sum of three terms, or four, or any fixed number. However, in your example you have a sum of $n$ terms, which is not a fixed number: it increases as $n$ increases. So you have a situation in which your terms are getting smaller and smaller, but you are adding up more and more of them: so it should make sense intuitively that the sum might not tend to zero.
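You can watch this happen numerically. Here is a quick sketch in Python (the helper name `partial_sum` is mine, not part of the question): each individual term shrinks toward zero, yet the sum of all $n$ of them does not.

```python
def partial_sum(n):
    """S_n = n/(n^2+1) + n/(n^2+2) + ... + n/(n^2+n)."""
    return sum(n / (n**2 + r) for r in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    # The first term alone, n/(n^2+1), tends to 0 as n grows...
    first_term = n / (n**2 + 1)
    # ...but the sum of all n terms stays close to 1.
    print(n, first_term, partial_sum(n))
```

Running this shows the first term shrinking like $1/n$ while the full sum creeps up toward $1$ rather than down toward $0$.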
An example which works in much the same way as yours but with easier calculations: $$\eqalign{ \lim_{n\to\infty}\Bigl(\frac{n}{n^2}+\frac{2n}{2n^2}+\frac{3n}{3n^2}+\cdots +\frac{n\cdot n}{n\cdot n^2}\Bigr) &=\lim_{n\to\infty}\Bigl(\frac{1}{n}+\frac{1}{n}+\frac{1}{n}+\cdots +\frac{1}{n}\Bigr)\cr &=\lim_{n\to\infty}1\cr &=1\ ,\cr}$$ even though each term inside the brackets tends to zero as $n\to\infty$.
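For the record, your original limit can be pinned down the same way, by squeezing the sum between $n$ copies of its smallest and largest terms (this derivation is mine, added for completeness). Since $1\le r\le n$, each term satisfies $$\frac{n}{n^2+n}\le\frac{n}{n^2+r}\le\frac{n}{n^2+1}\ ,$$ and summing over $r=1,\dots,n$ gives $$\frac{n^2}{n^2+n}\le\sum_{r=1}^{n}\frac{n}{n^2+r}\le\frac{n^2}{n^2+1}\ .$$ Both bounds tend to $1$ as $n\to\infty$, so by the squeeze theorem your sum tends to $1$, not $0$.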