If $d_{n} = \frac{{\beta}_{n}}{10^n}$ where ${\beta}_{n}$ takes integer values between 0 and 9, does $\sum_{n=1}^{\infty}d_{n}$ converge?


The series looks like a convergent geometric series: $$\sum_{n=1}^{\infty}\frac{1}{10^n} = \sum_{n=0}^{\infty}\frac{1}{10^n} - 1 = \frac{1}{9},$$ where each term is multiplied by an arbitrary integer constant between $0$ and $9$. I'm not sure how this affects the convergence of the series. I suspect it converges because the constants are bounded while the geometric terms shrink, but I'm not sure how to prove it.

The "worst case" would be one where $\beta_{n} = 9$ for all $n$. In that case:

$$\sum_{n=1}^{\infty}d_{n} = 9\sum_{n=1}^{\infty}\frac{1}{10^n} = 9\cdot\frac{1}{9} = 1$$

So, since $0 \leq d_n \leq \frac{9}{10^n}$ for every $n$ and the bounding series converges, I'm inclined to think the series converges in every case by comparison.
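As a quick numeric sanity check (just a sketch; the digit sequence `betas` below is an arbitrary choice, not part of the question), the partial sums are nondecreasing and stay below the worst-case value $1$:

```python
import random

random.seed(0)

# Pick arbitrary digits beta_n in {0, ..., 9} and accumulate
# the partial sums of sum_{n>=1} beta_n / 10**n.
betas = [random.randint(0, 9) for _ in range(60)]
partial = 0.0
partials = []
for n, b in enumerate(betas, start=1):
    partial += b / 10**n
    partials.append(partial)

# Nondecreasing partial sums, bounded above by the worst case
# 9 * sum 1/10**n = 1, so the series converges (monotone + bounded).
assert all(p2 >= p1 for p1, p2 in zip(partials, partials[1:]))
assert partials[-1] <= 1.0
```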

Thanks.


Best answer:

The argument you give is fine. You can also use the root test. Whatever the $\beta_n$ are, taking the $n$-th root kills the power of $10$ in the denominator, and you get $$\limsup_{n\to\infty} \sqrt[n]{\frac{\beta_n}{10^n}} = \limsup_{n\to\infty} \frac{\sqrt[n]{\beta_n}}{10} \leq \lim_{n\to \infty} \frac{\sqrt[n]{9}}{10} = \frac{1}{10} < 1,$$ which implies absolute convergence.
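A small numeric illustration of the root-test bound (a sketch, not part of the answer): since $\beta_n \leq 9$, the $n$-th roots $\sqrt[n]{\beta_n}/10$ are bounded by $\sqrt[n]{9}/10$, which is at most $9/10$ at $n=1$ and decreases toward $1/10$:

```python
# Upper bound on the n-th root of |d_n| = beta_n / 10**n:
# beta_n**(1/n) / 10 <= 9**(1/n) / 10 <= 9/10 < 1 for all n >= 1.
roots = [9 ** (1 / n) / 10 for n in range(1, 50)]

assert max(roots) == 9 / 10   # the bound 9/10 is attained at n = 1
assert all(r < 1 for r in roots)  # strictly below 1, so the test applies
```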