I have this summation inequality and am not sure how to reduce it: $$\sum_{s=1}^{r+1} \frac{s}{N} \gt \sum_{s=r+1}^{N} \frac{s}{N}$$
Here $r$ and $s$ range over $1,\dots,N$. For which $r$ does this inequality flip (to $\lt$) as $N \to \infty$? I know the value of $r$ should be $\frac{N}{\sqrt{2}}$, but I am not sure how to show this.
I first tried expanding the sums, but this gave me no luck:
$$\frac{(r+1)(r+2)}{2}\gt\frac{N(N+1)}{2}-\frac{r(r+1)}{2}$$
However, this left me no closer, with no root two in sight. Could anyone shed some light on this?
Thank you!
Put $t = r/N$ and divide both sides by $N$: each side becomes a Riemann sum $\frac1N\sum \frac{s}{N}$, so in the limit the crossover point satisfies
$$\int_0^t x \, dx=\int_t^1 x \, dx \implies \frac12 t^2=\frac12 -\frac12 t^2\implies t=\frac{\sqrt 2}2,$$
i.e. $r \approx \frac{N}{\sqrt 2}$.
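As a sanity check, here is a short numeric sketch (the function name `crossing_r` is my own) that finds the smallest $r$ for which the left sum exceeds the right sum, using the closed forms $\sum_{s=1}^{r+1} s = \frac{(r+1)(r+2)}{2}$ and $\sum_{s=r+1}^{N} s = \frac{N(N+1)}{2}-\frac{r(r+1)}{2}$, and compares it against $N/\sqrt2$:

```python
import math

def crossing_r(N):
    # Smallest r in 1..N with sum_{s=1}^{r+1} s/N > sum_{s=r+1}^{N} s/N.
    for r in range(1, N + 1):
        lhs = (r + 1) * (r + 2) / (2 * N)
        rhs = (N * (N + 1) - r * (r + 1)) / (2 * N)
        if lhs > rhs:
            return r
    return None

for N in (100, 1000, 10000):
    r = crossing_r(N)
    print(N, r, r / (N / math.sqrt(2)))  # ratio tends to 1 as N grows
```

For $N = 1000$ this gives $r = 707$, against $N/\sqrt2 \approx 707.1$.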