Given: $\forall n\in\Bbb N,\quad A_{n+1} = \frac 1 {1+A_n}$ and $A_1 = 0$.
Show that the sequence converges and find its limit.
Briefly, what I did was create two subsequences whose indices differ by 2. Doing that, I got two recursive formulas that depend on the initial term: $0$ for the odd-indexed subsequence ($A_1 = 0$) and $1$ for the even-indexed one ($A_2 = 1$). After showing both subsequences are bounded, I got stuck showing each is monotonic. Thanks.
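For concreteness, the two-step recurrence I obtained is
$$ A_{n+2} = \frac{1}{1+A_{n+1}} = \frac{1}{1+\frac{1}{1+A_n}} = \frac{1+A_n}{2+A_n}, \qquad\text{so}\qquad A_{n+2}-A_n = \frac{1-A_n-A_n^2}{2+A_n}, $$
whose sign is determined by whether $A_n$ lies below or above the positive root of $x^2+x-1=0$, namely $\frac{\sqrt 5 - 1}{2}$.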
Let $\phi=(1+\sqrt{5})/2$ and consider $B_n=|A_n-(\phi-1)|$; note that $\phi-1 = 1/\phi$ since $\phi^2 = \phi+1$. Then, $$ B_{n+1} = |A_{n+1}-(\phi-1)| = \left|\frac{1}{1+A_n}-\frac{1}{\phi}\right| = \left|\frac{A_n-(\phi-1)}{(1+A_n)\phi}\right| = \frac{B_n}{(1+A_n)\phi}. $$ Since $A_n \geq 0$ for all $n$ (immediate by induction: $A_1 = 0$, and $A_n \geq 0$ implies $A_{n+1} = \frac{1}{1+A_n} > 0$), we have $(1+A_n)\phi \geq \phi > 1$, hence $$ B_{n+1} \leq \frac{B_n}{\phi} \qquad\Longrightarrow\qquad 0 \leq B_n \leq \frac{B_1}{\phi^{n-1}} \to 0. $$ Thus $B_n \to 0$, i.e. the sequence $\{A_n\}$ converges, with limit $\phi-1 = \frac{\sqrt{5}-1}{2}$.
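Not part of the proof, but as a quick numerical sanity check, a minimal Python sketch that iterates the recurrence and verifies the contraction $B_{n+1} \le B_n/\phi$ at every step:

```python
import math

phi = (1 + math.sqrt(5)) / 2   # golden ratio; the claimed limit is phi - 1 = 1/phi

a = 0.0                        # A_1 = 0
b_prev = abs(a - (phi - 1))    # B_1
for _ in range(100):
    a = 1 / (1 + a)            # A_{n+1} = 1/(1 + A_n)
    b = abs(a - (phi - 1))     # B_{n+1}
    # Each step should contract the error by at least a factor of 1/phi
    # (small tolerance added for floating-point rounding near machine epsilon).
    assert b <= b_prev / phi + 1e-15
    b_prev = b

print(a)  # close to (sqrt(5) - 1)/2 = 0.6180339887...
```

After 100 iterations the error is far below double precision, consistent with the geometric bound $B_n \le B_1/\phi^{n-1}$.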