Consider a sequence bₙ with values in ℝ, indexed by ℕ (so the sequence has infinitely many terms). Suppose the range of bₙ is contained in the bounded interval [a, b].
Use the Principle of Mathematical Induction to prove that for every n ∈ ℕ there is an interval of length (b−a)/2ⁿ which contains infinitely many terms of the sequence.
I understand it intuitively: bₙ has infinitely many terms, so some smaller interval should still contain infinitely many of them. However, I am not sure how I would go about proving this inductively.
Base case $n = 0$
The interval $[a,b]$ has length $\frac{b-a}{2^0} = b-a$ and, by assumption, contains every term of $\{b_n\}$, hence infinitely many of them.
Inductive hypothesis:
Suppose there is an interval of length $\frac{b-a}{2^n}$ that contains infinitely many terms of the sequence; call this interval $[c,d]$.

We must show that this implies there is an interval of length $\frac{b-a}{2^{n+1}}$ that also contains infinitely many terms.

The closed halves $[c, \frac{c+d}{2}]$ and $[\frac{c+d}{2}, d]$ are sub-intervals of $[c,d]$, each of length $\frac{d-c}{2} = \frac{b-a}{2^{n+1}}$.

At least one of these halves contains infinitely many terms of the sequence: if both contained only finitely many, then their union $[c,d]$ would contain only finitely many terms, contradicting the inductive hypothesis. (Closed halves are used so that terms landing exactly on the midpoint $\frac{c+d}{2}$ are not lost.)
QED
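As a sanity check (not part of the proof), here is a small Python sketch of the bisection argument on the hypothetical concrete sequence bₙ = sin(n), whose terms all lie in [a, b] = [−1, 1]. At each step it keeps whichever closed half contains more of the first N terms — a finite stand-in for "infinitely many" — and checks that the surviving interval has length (b−a)/2ⁿ after n halvings:

```python
import math

# Concrete example: b_n = sin(n); every term lies in [a, b] = [-1, 1].
N = 100_000
terms = [math.sin(n) for n in range(N)]

a, b = -1.0, 1.0
c, d = a, b
for step in range(1, 11):
    mid = (c + d) / 2
    # Count how many (truncated-sequence) terms fall in each closed half.
    left = sum(1 for t in terms if c <= t <= mid)
    right = sum(1 for t in terms if mid <= t <= d)
    # Pigeonhole: left + right >= N, so one half holds at least N/2 terms.
    c, d = (c, mid) if left >= right else (mid, d)
    # The interval length halves each step, matching (b - a) / 2**n.
    assert d - c == (b - a) / 2**step

print(f"after 10 halvings: [{c:.6f}, {d:.6f}], length {d - c}")
```

This is only a finite approximation, of course: the code cannot certify "infinitely many," but it mirrors the inductive step exactly — halve the interval, keep a half that is still well populated.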