Let $c > 0$ be arbitrary and let $b_0 \in \left]0, \frac{1}{c}\right[$ be given. For $n \in \mathbb{N}$ set $b_{n+1} := 2b_n - cb_n^2$. Determine whether the sequence $(b_n)$ converges and, if so, find its limit.
I'm not quite sure how to tackle this. I first wanted to try calculating the first few terms of this sequence, but then I saw the open interval for $b_0$ and felt helpless.
Edit: Someone suggested defining $r_n := 1 - cb_n$, which satisfies $r_{n+1} = r_n^2$. Can someone explain why this helps?
As suggested by geetha290krm, one shows by induction that $0 < b_n < \frac{1}{c}$ for all $n$: if $0 < b_n < \frac{1}{c}$, then $b_{n+1} = 2b_n - cb_n^2 = \frac{1}{c} - c\left(b_n - \frac{1}{c}\right)^2 \in \left]0, \frac{1}{c}\right[$. Moreover $b_{n+1} - b_n = b_n(1 - cb_n) > 0$, so $(b_n)$ is increasing and bounded above, hence convergent. Its limit $L$ must satisfy $L = 2L - cL^2$, i.e. $L \in \{0, \frac{1}{c}\}$, and since $b_n \ge b_0 > 0$ for all $n$, the limit is $\frac{1}{c}$. Equivalently, with $r_n = 1 - cb_n$ one computes $r_{n+1} = 1 - cb_{n+1} = 1 - 2cb_n + c^2b_n^2 = (1 - cb_n)^2 = r_n^2$, so $r_n = r_0^{2^n}$ with $r_0 \in ]0,1[$: the sequence $(r_n)$ is decreasing and bounded below by $0$, hence converges to $0$, which again gives $b_n \to \frac{1}{c}$.