I understand: $$\sum\limits^n_{i=1} i = \frac{n(n+1)}{2}$$ what happens when we restrict the range such that: $$\sum\limits^n_{i=n/2} i = ??$$
Originally I thought it might just be $\frac{1}{2}\cdot\frac{n(n+1)}{2}$, but I know that's not correct, since starting the summation at $n/2$ keeps only the larger half of the numbers between $1$ and $n$.
No matter what sequence you're adding up (i.e. no matter what $a_i$ is), so long as $m \lt n$ we know that $$\sum^{m-1}_{i = 1} a_i + \sum_{i = m}^n a_i = \sum_{i = 1}^n a_i$$
so we can move the first term over to the right-hand side and get
$$\sum_{i = m}^n a_i = \sum_{i = 1}^n a_i - \sum^{m-1}_{i = 1} a_i$$
Can you figure out how to apply this to your situation?
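If it helps, you can sanity-check the splitting identity numerically before plugging in the closed forms. A minimal sketch in Python, taking $a_i = i$ and $m = n/2$ (assuming $n$ is even so $n/2$ is an integer; for odd $n$ you'd have to decide how to round):

```python
# Verify: sum_{i=m}^{n} a_i = sum_{i=1}^{n} a_i - sum_{i=1}^{m-1} a_i,
# with a_i = i and m = n/2 (n assumed even for this check).
n = 10
m = n // 2

left = sum(range(m, n + 1))                      # sum from m to n
right = sum(range(1, n + 1)) - sum(range(1, m))  # full sum minus the head sum_{i=1}^{m-1}
print(left, right)  # the two values should agree
```

From there, replacing each sum on the right with the closed form $\frac{k(k+1)}{2}$ gives you the answer in terms of $n$.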