Consider the following definition by induction:
$$ \begin{align} f(0)&=1\\ f(1)&=1/2\\ f(n)&=\frac{f(n-2)}{f(n-1)+f(n-2)}\qquad (n\ge 2) \end{align}$$
I can see that this defines a sequence that oscillates between roughly 0.4 and 0.6 (which is good, because these are supposed to be probabilities). How do I determine this analytically? In other words, how do I show (a) that the sequence diverges (does not converge), and (b) that its terms oscillate around those two values?
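As a sanity check (not a proof), one can iterate the recurrence with exact rational arithmetic; the helper name `f_values` below is mine:

```python
# Iterate f(n) = f(n-2) / (f(n-1) + f(n-2)) with exact rational
# arithmetic, to watch the oscillation around 1/2 without rounding error.
from fractions import Fraction

def f_values(n_terms):
    """First n_terms values of the sequence, starting from f(0)=1, f(1)=1/2."""
    vals = [Fraction(1), Fraction(1, 2)]
    while len(vals) < n_terms:
        prev2, prev1 = vals[-2], vals[-1]  # f(n-2), f(n-1)
        vals.append(prev2 / (prev1 + prev2))
    return vals

vals = f_values(12)
for n, v in enumerate(vals):
    print(n, float(v))
```

The printed terms alternate above and below $\frac{1}{2}$, and the consecutive sums $f(n)+f(n+1)$ drift toward 1, which is consistent with the two observed values 0.4 and 0.6 summing to 1.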
Basically not an answer, but this might help? (Too long for a comment.)
If $f(n-1)>f(n-2)$ , then: $$f(n)=\frac{f(n-2)}{f(n-1)+f(n-2)}<\frac{f(n-2)}{f(n-2)+f(n-2)}=\frac{1}2$$
If $f(n-1)<f(n-2)$ , then: $$f(n)=\frac{f(n-2)}{f(n-1)+f(n-2)}>\frac{f(n-2)}{f(n-2)+f(n-2)}=\frac{1}2$$
We have $f(0)=1>f(1)=\frac{1}2$ , so $f(2)>\frac{1}{2}$ .
And $f(2)>\frac{1}2=f(1)$ , so $f(3)<\frac{1}{2}$ .
And $f(3)<f(2)$ , so $f(4)>\frac{1}{2}$ .
From here we can see, by induction, that the sequence keeps jumping back and forth across $\frac{1}2$. Doing some calculation on the very first values suggests that the terms stay bounded away from $\frac{1}2$ rather than approaching it, so the sequence should be divergent?
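To back up that last paragraph numerically (again, no proof; `subsequences` is an ad-hoc helper of mine), one can check that the even- and odd-indexed subsequences each appear to be monotone and bounded, which would make each of them convergent on its own:

```python
# Split the sequence into even- and odd-indexed subsequences. If each is
# monotone and bounded, each converges; if their limits differ, the full
# sequence diverges by oscillating between the two limits.
from fractions import Fraction

def subsequences(n_terms):
    vals = [Fraction(1), Fraction(1, 2)]
    while len(vals) < n_terms:
        vals.append(vals[-2] / (vals[-1] + vals[-2]))
    return vals[0::2], vals[1::2]  # even-indexed, odd-indexed

even, odd = subsequences(40)
print("even decreasing:", all(a > b for a, b in zip(even, even[1:])))
print("odd  decreasing:", all(a > b for a, b in zip(odd, odd[1:])))
print("last even, last odd:", float(even[-1]), float(odd[-1]))
```

Empirically both subsequences decrease, with the even one staying above $\frac{1}2$ and the odd one below it; if that persists, each converges to its own limit and the full sequence cannot converge.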
We should wait for a proper proof.