While reading Steven Finch's amazing book *Mathematical Constants* I once encountered Grossman's constant. This is an interesting constant $c$ defined as the unique $x_1\in\mathbb{R}$ such that the sequence $\{x_n\}_{n=0}^\infty$ defined by the recurrence:
$$x_{n}=\frac{x_{n-2}}{1+x_{n-1}}$$
for $n\ge2$ with $x_0=1$ converges, where $c\approx 0.73733830336929\ldots$. This seems like quite a remarkable theorem, and I have no idea how to go about proving that a recurrence of this form converges for exactly one value of $x_1$, although it seems to have something to do with the limiting behaviour of the odd and even terms. I do not have access to the paper referenced by Finch and MathWorld in which the proof is apparently given, so I am wondering, at the very least, what techniques were used to prove it.
My question is: Does anyone know of (or can come up with) a proof (or even the idea of a proof) that this sequence converges for a unique $x_1$? Also, is any closed form for $c$ yet known?
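A quick numerical experiment (only a sketch, using the quoted approximate value of $c$; the function name `iterate` is mine) illustrates the dichotomy between the even- and odd-indexed terms:

```python
# Iterate x_{n+2} = x_n / (1 + x_{n+1}) with x_0 = 1 and a chosen x_1,
# returning the first n_terms terms of the sequence.
def iterate(x1, n_terms=60, x0=1.0):
    xs = [x0, x1]
    while len(xs) < n_terms:
        xs.append(xs[-2] / (1 + xs[-1]))
    return xs

# Below c the even-indexed terms level off and the odd-indexed terms
# die out; above c it is the other way around; near c both decay together.
for x1 in (0.5, 0.7373383, 0.9):
    xs = iterate(x1)
    print(f"x1 = {x1}:  x_58 = {xs[58]:.6f},  x_59 = {xs[59]:.6f}")
```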
This is not an answer, but here is a collection of facts about these sequences:
If $x_0,x_1 \ge 0$ then $x_n \ge 0$ for all $n$, and $x_{n+2} = \frac{x_n}{1+x_{n+1}} \le x_n$, so the two subsequences
$(x_{2n})$ and $(x_{2n+1})$ are decreasing; hence they have limits $l_0$ and $l_1$.
If one of the limits, say $l_0$, is nonzero, then $x_{2n+1} = \frac{x_{2n-1}}{1+x_{2n}} \le \frac{x_{2n-1}}{1+l_0}$, so the other subsequence converges to $0$ geometrically. Hence at least one of the two limits is always $0$, and the full sequence converges only if both are. We then have to prove that for all $x_0 \ge 0$ there is a unique $x_1 \ge 0$ such that the sequence converges to $0$.
A long computation shows that
$(x_{n+3} - x_{n+2}) - (x_{n+1} - x_n) = \frac {x_n^2 x_{n+1}}{(1+x_{n+1})(1+x_n+x_{n+1})} \ge 0$,
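For the record, here is one way to organize that computation. The recurrence gives
$$x_{n+2}-x_n = -\frac{x_n x_{n+1}}{1+x_{n+1}}, \qquad x_{n+3}-x_{n+1} = -\frac{x_{n+1} x_{n+2}}{1+x_{n+2}},$$
and substituting $x_{n+2} = \frac{x_n}{1+x_{n+1}}$ into the second gives $\frac{x_{n+2}}{1+x_{n+2}} = \frac{x_n}{1+x_n+x_{n+1}}$. Regrouping,
$$(x_{n+3}-x_{n+2})-(x_{n+1}-x_n) = (x_{n+3}-x_{n+1})-(x_{n+2}-x_n) = x_n x_{n+1}\left(\frac{1}{1+x_{n+1}}-\frac{1}{1+x_n+x_{n+1}}\right),$$
which is the right-hand side above.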
and so the sequences $(x_{2n+1}-x_{2n})$ and $(x_{2n+2}-x_{2n+1})$ are increasing. In particular, as soon as one of them becomes positive, we know that the sequence will not converge. Conversely, if $(x_{2n})$ doesn't converge to $0$, then $l_0 > 0$ while $(x_{2n+1})$ converges to $0$, so $x_{2n+2} - x_{2n+1} \to l_0 > 0$ and this difference must become positive at some point; similarly in the other case.
This means that $(x_n)$ converges to $0$ if and only if it stays decreasing forever, and we can certify that a particular sequence does not converge to $0$ by computing its terms until it stops decreasing.
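Taking the uniqueness for granted, this decision procedure turns into a bisection for $c$ (a sketch only; the cutoff `max_iter` and the function names are my own choices). One checks from the recurrence that the even-indexed terms are decreasing and the odd-indexed terms increasing as functions of $x_1$, so a first increase $x_{n+1} > x_n$ at even $n$ means the odd subsequence has a positive limit ($x_1$ too large), while at odd $n$ it means $x_1$ is too small:

```python
def first_increase(x1, x0=1.0, max_iter=5_000_000):
    """Iterate x_{n+2} = x_n / (1 + x_{n+1}) and return the first index n
    with x_{n+1} > x_n, or None if the sequence is still decreasing after
    max_iter steps (i.e. x1 is indistinguishable from c at this cutoff)."""
    prev, cur = x0, x1
    for n in range(max_iter):
        if cur > prev:
            return n
        prev, cur = cur, prev / (1 + cur)
    return None

def grossman_bisect(steps=14):
    """Bisect on x_1 with x_0 = 1: an increase at even n means x_1 was
    too large, at odd n too small."""
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = (lo + hi) / 2
        n = first_increase(mid)
        if n is None:        # mid is within the cutoff's resolution of c
            return mid
        if n % 2 == 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(grossman_bisect())   # ~ 0.7373
```

Note that resolving $c$ to high precision this way gets expensive: the closer $x_1$ is to $c$, the longer the sequence keeps decreasing before the first increase appears.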
It also follows that the set $\{(x_0,x_1) \in\Bbb R_+^2\mid \lim x_n = 0\}$ is a closed subset of $\Bbb R_+^2$: non-convergence is witnessed by finitely many terms, each of which depends continuously on $(x_0,x_1)$, so the complement is open.