Let me motivate my question with an example. Suppose I have two finite series $A$ and $B$. Series $A$ has the values $-1/150, -1/149, ... , -1/141$. Series $B$ has the values $-1/100, -1/99, ... , -1/91$. Computing the sample variance of each series in any statistical software such as R reveals that the variance of Series $A$ is less than the variance of Series $B$. But is this true more generally for finite series?
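For what it's worth, the example can be checked without R; a minimal Python check (using the $n-1$ denominator, which matches R's `var()`):

```python
# Sample variance with the n-1 denominator, matching R's var().
def sample_var(xs):
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

A = [-1 / d for d in range(150, 140, -1)]  # -1/150, ..., -1/141
B = [-1 / d for d in range(100, 90, -1)]   # -1/100, ..., -1/91
print(sample_var(A) < sample_var(B))  # True
```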
Specifically, let Series $A$ have the values $-1/(b+k), -1/(b+k-1), -1/(b+k-2), ... , -1/(b+k-9)$, and let Series $B$ have the values $-1/b, -1/(b-1), -1/(b-2), ... , -1/(b-9)$. How do I prove algebraically that the variance of Series $A$ is lower than the variance of Series $B$? Here $b$ and $k$ are positive constants, with $b > 9$ so that every denominator is positive.
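One possible route, sketched under the assumption $b > 9$ (so all denominators are positive): write the variance in its pairwise-difference form and compare term by term.

```latex
s^2 = \frac{1}{2n(n-1)} \sum_{i \ne j} (x_i - x_j)^2 .

\text{With } x_i = -\frac{1}{b+k-i} \text{ (Series } A\text{) and }
y_i = -\frac{1}{b-i} \text{ (Series } B\text{), for } i = 0, \dots, 9:

x_i - x_j = \frac{j-i}{(b+k-i)(b+k-j)}, \qquad
y_i - y_j = \frac{j-i}{(b-i)(b-j)} .

\text{Since } b+k-i > b-i > 0 \text{ for every } i, \text{ each }
(x_i - x_j)^2 < (y_i - y_j)^2, \text{ hence } s^2_A < s^2_B .
```

The point of this form is that the variance depends only on the squared pairwise gaps, so it suffices to show every gap in $A$ is smaller than the matching gap in $B$, which follows because $k > 0$ enlarges both factors in the denominator.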
If the denominator in Series $A$ instead declines more slowly than the decline of $1$ per term in Series $B$, can I show that the above result remains true?
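A hedged numerical sketch of this generalization: here I assume "slower rate" means Series $A$'s denominator declines by some step $c$ with $0 < c < 1$ per term (the step `c` and the particular values of `b` and `k` below are illustrative, not from the question):

```python
# Sample variance with the n-1 denominator, matching R's var().
def sample_var(xs):
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

# Ten values -1/b0, -1/(b0-step), ..., -1/(b0-9*step).
def series(b0, step, n=10):
    return [-1 / (b0 - step * i) for i in range(n)]

b, k, c = 100, 50, 0.5        # illustrative choices with b > 9, k > 0, 0 < c < 1
A = series(b + k, c)           # denominators b+k, b+k-c, ...: slower decline
B = series(b, 1.0)             # denominators b, b-1, ...: the original rate
print(sample_var(A) < sample_var(B))  # True
```

The same pairwise-difference argument goes through: with step $c$, the gaps pick up a factor of $c < 1$ in the numerator while the denominators stay larger, so each squared gap only shrinks further.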
I feel almost stupid asking this, because I took quite a bit of probability and statistics in college, but I've looked everywhere for help on this and haven't found anything. Thanks!