As far as I understand, the Monotonic Sequence Theorem states that if a sequence is monotonic and the individual terms are bounded, then the sequence is convergent.
My book states that $\lim \limits_{t \to \infty}\sum_{n=1}^t b^{\ln n}$ converges only for $b < \frac{1}{e}$. However, isn't $\lim \limits_{t \to \infty}\sum_{n=1}^t 0.5^{\ln n}$, for example, a series whose terms are monotonically decreasing and bounded below by $0$ and above by $1$? It therefore appears to meet the criteria of the MST, yet it diverges, contrary to the outcome the theorem predicts.
Why is $\lim \limits_{t \to \infty}\sum_{n=1}^t 0.5^{\ln n}$ divergent when it appears to meet the Monotonic Sequence Theorem's criteria for convergence?
You've confused sequence convergence (the $n$th term has a finite limit as $n\to\infty$) with series convergence (the sum of the first $n$ terms has a finite limit as $n\to\infty$). Applied to the terms $0.5^{\ln n}$, the Monotonic Sequence Theorem only tells you that the terms themselves converge (to $0$). For the series, the relevant monotonic sequence is the sequence of partial sums, which is increasing but *not* bounded above: $0.5^{\ln n} = n^{\ln 0.5} = n^{-\ln 2}$, and since $\ln 2 \approx 0.693 < 1$, this is the general term of a divergent $p$-series.
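As a quick numerical illustration (a sketch, not part of the original answer), one can watch the two behaviors side by side: the terms $0.5^{\ln n}$ shrink toward $0$, while the partial sums keep growing without bound:

```python
import math

def partial_sum(N):
    """Partial sum of sum_{n=1}^{N} 0.5**(ln n) = sum_{n=1}^{N} n**(-ln 2)."""
    return sum(0.5 ** math.log(n) for n in range(1, N + 1))

for N in (10, 1000, 100000):
    # The term 0.5**(ln N) tends to 0, yet the partial sums keep increasing:
    # the terms' convergence says nothing about the series' convergence.
    print(N, 0.5 ** math.log(N), partial_sum(N))
```

Since the terms behave like $n^{-0.693}$, the partial sums grow roughly like $N^{0.307}$: slowly, but without any upper bound, which is exactly why the series diverges even though each individual term is trapped between $0$ and $1$.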