I'm trying to prepare for an upcoming test. This is a question I found on a previous midterm that I've been trying to solve:
Consider the sequence of functions defined on $[0,1]$ given by:
$f_0(x)=x,\quad f_1(x)=x(1-x),\quad \dots,\quad f_{n+1}(x)=f_n(x)(1-f_n(x)).$
Prove that $f_n$ converges to $0$.
I'm not quite sure how to prove this.
One idea I had was to try to use Theorem 7.9 in Rudin by defining:
$M_n = \sup_{x\in [0,1]} |f_n(x)|$
and trying to show that $M_n \to 0$ as $n \to \infty,$ but that hasn't gotten me anywhere. Can anyone give me a hand?
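For what it's worth, here is a quick numerical check of the $M_n$ idea (just an illustration, not a proof; the maximum over a finite grid only approximates the sup, and the grid size is arbitrary):

```python
# Iterate the map f_{n+1}(x) = f_n(x)(1 - f_n(x)) pointwise on a grid
# over [0,1], tracking the grid maximum as a stand-in for M_n.
def sup_f_n(steps, grid_size=10001):
    xs = [i / (grid_size - 1) for i in range(grid_size)]  # f_0(x) = x
    sups = []
    for _ in range(steps):
        xs = [v * (1 - v) for v in xs]  # apply the recursion at every grid point
        sups.append(max(xs))           # approximate M_n = sup |f_n|
    return sups

sups = sup_f_n(200)
print(sups[0], sups[9], sups[-1])  # approximate M_1, M_10, M_200
```

Numerically the maxima do shrink toward $0$ (the first one is $M_1 = \max_x x(1-x) = 1/4$), but quite slowly, which is maybe why I couldn't see an obvious bound.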
Thank you kindly.
$(1).$ For $x=0$ or $x=1:$ We have $f_1(x)=0,$ and by induction $f_n(x)=0$ for all $n\ge 1.$
$(2).$ For $x\in (0,1):$ We have $f_0(x)\in (0,1).$ If $f_n(x)\in (0,1)$ then, since $1-f_n(x)\in (0,1)$ as well, $f_{n+1}(x)=f_n(x)(1-f_n(x))\in (0,f_n(x)).$ So by induction on $n\ge 0$ the sequence $(f_n(x))_{n\in \Bbb N_0}$ is strictly decreasing and bounded below by $0,$ so it has a limit $L\in\Bbb R,$ and by continuity of $t\mapsto t(1-t)$ we have
$L=\lim_{n\to\infty}f_n(x)=\lim_{n\to\infty}f_{n+1}(x)=\lim_{n\to\infty}f_n(x)\bigl(1-f_n(x)\bigr)=L(1-L)=L-L^2.$
So $L=L-L^2,$ hence $L^2=0,$ and therefore $L=0.$
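A quick numerical sketch of part $(2)$, tracking the orbit of one point (the choice $x=0.7$ is arbitrary; any $x\in(0,1)$ behaves the same way):

```python
# For a sample x in (0,1), the values f_n(x) under the recursion
# f_{n+1}(x) = f_n(x)(1 - f_n(x)) decrease monotonically toward 0.
v = 0.7          # f_0(x) = x with x = 0.7
seq = [v]
for _ in range(10000):
    v = v * (1 - v)   # one step of the recursion
    seq.append(v)
print(seq[:4], seq[-1])
```

The decrease is strict at every step, matching the inclusion $f_{n+1}(x)\in (0,f_n(x))$ above, and after many iterations the value is very close to $0$ (the convergence is slow, roughly like $1/n$).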
Addendum: We can handle $(1)$ and $(2)$ together by noting that for any $x\in [0,1]$ we have $f_0(x)\in [0,1],$ and $f_n(x)\in [0,1]\implies f_{n+1}(x)\in [0,f_n(x)]\subseteq [0,1],$ so $(f_n(x))_{n\in\Bbb N_0}$ is non-increasing and bounded below by $0,$ and the limit computation above goes through unchanged.