Partially into the derivation, the author substitutes the result of this infinite series,
$$ \sum_{k=0}^\infty kx^k = \frac{x}{(1-x)^2} $$
into the bounded sum,
$$\sum_{j=0}^h j\left(\frac{1}2\right)^j.$$
The author says "since the infinite series is bounded, we can use it instead as an easy approximation." I don't understand why. Can I generally do this?
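As a quick numerical sanity check (a sketch I added, using $x=\tfrac12$ as in the question), we can compare the finite partial sums against the closed form $x/(1-x)^2$, which equals $2$ at $x=\tfrac12$:

```python
# Compare partial sums of sum_{j=0}^h j*(1/2)^j with the infinite-series value.
# At x = 1/2, the closed form x/(1-x)^2 gives (1/2) / (1/4) = 2.
x = 0.5
limit = x / (1 - x) ** 2  # value of the infinite series

for h in (1, 5, 10, 20, 50):
    partial = sum(j * x**j for j in range(h + 1))
    print(f"h={h:2d}: partial sum = {partial:.10f} <= {limit}")
```

Every partial sum stays below $2$ and the sequence of partial sums increases toward it, which is exactly why the infinite series can serve as an upper-bound "approximation" for the bounded sum.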
Since the terms in the series are nonnegative, the partial sums are monotonically increasing; since they are also bounded above, each partial sum is bounded above in particular by the least upper bound, which is the limit of the partial sums, i.e., the value of the infinite series.
More precisely, I am referring to the following theorem:
A monotonically increasing sequence $\{a_n\}$ converges if and only if it is bounded. For if the set $\{a_n:n=1,2,\ldots\}$ is bounded above, it has a least upper bound $a$, and given $\varepsilon>0$ we may choose $N$ so that $n\geqslant N$ implies $a-\varepsilon<a_n\leqslant a$; hence $\lim_{n\to\infty}a_n=a$. Conversely, if $\{a_n\}$ is unbounded, then for any $a\in\mathbb R$ there exists $N$ such that $n\geqslant N$ implies $a_n-a>1$, so $a_n$ does not converge to $a$.
Now, given a sequence $\{a_n\}$, there is the corresponding sequence of partial sums $\{S_n\}$ where $$S_n = \sum_{k=1}^n a_k. $$ If $a_k\geqslant 0$ for all $k$, then $S_1=a_1$ and for each $n\geqslant1$, $$S_{n+1}=S_n + a_{n+1}\geqslant S_n, $$ so $\{S_n\}$ is a monotonically increasing sequence. It follows that $\lim_{n\to\infty} S_n$ exists if and only if $\{S_n\}$ is bounded above.
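To illustrate the monotone-convergence argument numerically (my own example, not from the question), take $a_k = 1/k^2$: the terms are nonnegative, so the partial sums $S_n$ increase monotonically, and they are bounded above by the limit $\pi^2/6$:

```python
import math

# Partial sums S_n of a_k = 1/k^2 (k >= 1): nonnegative terms make S_n
# monotonically increasing, and every S_n is bounded above by the limit pi^2/6.
S, total = [], 0.0
for k in range(1, 10001):
    total += 1 / k**2
    S.append(total)

assert all(s2 >= s1 for s1, s2 in zip(S, S[1:]))  # monotone increasing
assert all(s <= math.pi**2 / 6 for s in S)        # bounded above by the limit
print(f"S_10000 = {S[-1]:.6f}, limit = {math.pi**2/6:.6f}")
```

The supremum of the partial sums is the limit of the series, which is precisely the bound used in the answer above.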