Proving whether a sequence of this form always converges to 0


Let $(b_{n})$ be a sequence of real numbers such that $0<b_{n}<1$ for all $n$, and let $(a_{n})$ be the sequence of real numbers defined by $a_{n+1}=a_{n}b_{n}$ with $a_{1}=1$. Is it always true that $\lim_{n\to \infty} a_{n}=0$?

I know that $a_{n}=b_{1}b_{2}\cdots b_{n-1}$ for $n\ge 2$. I also know that $(a_{n})$ is bounded below by zero and strictly decreasing. I should use the MCT somehow, but I'm stuck.

I want to prove that the limit is always 0, so I am trying to show that for every $\epsilon>0$ and all sufficiently large $n$,

$|a_{n}-0|=\left|\frac{c_{1}\cdots c_{n-1}}{d_{1}\cdots d_{n-1}}-0\right|<\epsilon$, where $b_{n}=\frac{c_{n}}{d_{n}}$.
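
As far as I can tell, writing the MCT step out explicitly only gives
$$a_{n+1}<a_{n}\ \text{ and }\ a_{n}>0\ \text{ for all } n \quad\Longrightarrow\quad \lim_{n\to\infty}a_{n}=L\ \text{ for some } L\in[0,1),$$
and nothing here forces $L=0$.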


2 Answers

Best answer

Let $(a_n)$ be any strictly decreasing sequence with $a_1=1$ that converges to some limit $a>0$. Now consider $b_n:=\frac{a_{n+1}}{a_n}$. Since $0<a_{n+1}<a_n$ for every $n$, each $b_n$ lies in $\left(0,1\right)$, and by construction $a_{n+1}=a_n b_n$, yet $a_n\to a\ne 0$. Thus the answer is no.
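
For example, one concrete choice of such a sequence (not from the answer itself, just to illustrate the construction) is $a_n=\frac{n+1}{2n}$: then $a_1=1$, $(a_n)$ decreases strictly to $\frac 12$, and
$$b_n=\frac{a_{n+1}}{a_n}=\frac{n(n+2)}{(n+1)^2}=1-\frac{1}{(n+1)^2}\in(0,1),$$
yet $a_n\to\frac 12\ne 0$.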

Second answer

This is a somewhat more explicit answer than Mr. Max's excellent answer. Let $a_1=1$ and $$(b_n)=\left(\frac{1}{1.9},\frac{1.9}{1.99}, \frac{1.99}{1.999},\dots\right).$$ Clearly each $b_n\in(0,1)$, and the product telescopes to $$ a_n=\frac{1}{1.99\dots9}, $$ where there are a total of $n-1$ nines in the denominator. Therefore $a_n\to \frac 12 \ne 0$.
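
As a quick numerical sanity check (an illustrative sketch only, not part of the argument), one can multiply out the first several dozen terms of this $(b_n)$, writing $1.99\dots9$ with $k$ nines as $2-10^{-k}$, and watch the partial products settle near $\frac 12$ rather than $0$:

```python
# Numerical check of the counterexample above:
# b_n = (2 - 10^{-(n-1)}) / (2 - 10^{-n}), i.e. 1/1.9, 1.9/1.99, 1.99/1.999, ...
def b(n):
    return (2 - 10.0 ** -(n - 1)) / (2 - 10.0 ** -n)

a = 1.0                 # a_1 = 1
for n in range(1, 60):  # a_60 = b_1 * b_2 * ... * b_59
    a *= b(n)

print(a)                # roughly 0.5, not 0
```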