Suppose the series $$\sum a_n$$ diverges, where $a_n\ge 0$ and the sequence $(a_n)$ is monotone non-increasing. If exactly one element is chosen from each block of $k$ consecutive terms -- i.e., one element from $\{a_0,a_1,\dots,a_{k-1}\}$, one element from $\{a_k,\dots,a_{2k-1}\}$, and so on -- must the resulting series diverge? That is, must $$\sum_i a_{n_i}=\infty,\qquad n_i\in[ik,(i+1)k)?$$
Another Divergent Series Question
80 views, asked by user142299 (https://math.techqa.club/user/user142299/detail) on 2026-03-28.

There are 2 solutions below.
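Before the answers, a quick numerical sanity check (my own illustration, not part of the original question): take the divergent harmonic series $a_n = 1/(n+1)$ with block size $k=3$ and make the worst possible choice, the smallest element of each block. The partial sums of the chosen subseries still grow without apparent bound.

```python
K = 3  # block size (illustrative choice)

def a(n):
    """Terms of the harmonic series: non-negative, non-increasing, divergent sum."""
    return 1.0 / (n + 1)

def worst_choice_partial_sum(num_blocks):
    """Pick the last (smallest) term a_{(i+1)K-1} from each block [iK, (i+1)K)."""
    return sum(a((i + 1) * K - 1) for i in range(num_blocks))

for blocks in (10, 100, 1000, 10000):
    print(blocks, round(worst_choice_partial_sum(blocks), 3))
```

The printed values keep increasing roughly like $\tfrac{1}{3}\ln n$, which is what divergence of the chosen subseries would predict.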
Clearly some choice of one element per block yields a divergent sub-series: the full series splits into the $k$ "canonical" sub-series (position $j$ of each block, for $j = 0, \dots, k-1$), and if all of these converged, the overall series would converge. Now note that, since the sequence is non-increasing, any element chosen from one block is greater than or equal to any element chosen from the next block. Thus, for an arbitrary choice of elements from the blocks of size $k$, the resulting sum is greater than or equal to the sum, from the second term onward, of the divergent choice. Hence every choice of elements yields a divergent sub-series.
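The domination step in this answer can be checked numerically. Below is a sketch (my own illustration, with $a_n = 1/(n+1)$ as an assumed example of a non-increasing sequence): for any two choices $c$ and $d$, the element $c$ picks from block $i$ is at least the element $d$ picks from block $i+1$, so the sum over $c$ dominates the tail of the sum over $d$.

```python
import random

K, NUM_BLOCKS = 4, 500
A = [1.0 / (n + 1) for n in range(K * NUM_BLOCKS)]  # non-increasing terms

def random_choice(rng):
    """One index from each block [i*K, (i+1)*K)."""
    return [rng.randrange(i * K, (i + 1) * K) for i in range(NUM_BLOCKS)]

def dominates_shifted_tail(c, d):
    """sum over choice c >= sum over choice d from its 2nd term onward."""
    # Term by term: A[c[i]] >= A[d[i+1]] because c[i] < (i+1)*K <= d[i+1].
    return sum(A[j] for j in c) >= sum(A[d[i]] for i in range(1, NUM_BLOCKS))

rng = random.Random(0)
c, d = random_choice(rng), random_choice(rng)
print(dominates_shifted_tail(c, d))  # True for every pair of choices
```

Since the comparison holds term by term, it holds for any pair of choices, not just the sampled one.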
Since the sequence is non-increasing and $n_i \leqslant (i+1)k - 1$, we have
$$a_{n_i} \geqslant a_{(i+1)k-1} \geqslant \frac{1}{k} \sum_{m=(i+1)k}^{(i+2)k-1} a_m,$$
whence
$$\sum_{i=0}^\infty a_{n_i} \geqslant \frac{1}{k}\sum_{m=k}^\infty a_m = \infty,$$
since discarding the first $k$ terms does not affect the divergence of $\sum a_m$.
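This averaging bound can also be verified on finite truncations. A minimal sketch (my own, again using $a_n = 1/(n+1)$ as an assumed non-increasing example, with the worst-case choice $n_i = (i+1)k-1$):

```python
def subseries_lower_bound(k=3, num_blocks=1000):
    """Check a_{n_i} >= (1/k) * (next block's sum) and return (lhs, rhs)."""
    a = [1.0 / (n + 1) for n in range(k * (num_blocks + 1))]
    # Worst case: n_i = (i+1)k - 1, the smallest term of block i.
    chosen = [(i + 1) * k - 1 for i in range(num_blocks)]
    for i in range(num_blocks):
        avg_next = sum(a[m] for m in range((i + 1) * k, (i + 2) * k)) / k
        assert a[chosen[i]] >= avg_next  # per-block inequality from the answer
    lhs = sum(a[n] for n in chosen)                              # sum of a_{n_i}
    rhs = sum(a[m] for m in range(k, k * (num_blocks + 1))) / k  # (1/k) * tail sum
    return lhs, rhs

lhs, rhs = subseries_lower_bound()
print(lhs >= rhs)  # True
```

Since the right-hand side is $1/k$ times a tail of a divergent series, letting the number of blocks grow forces the left-hand side to infinity as well.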