I saw this question online
"We define a series $\{a_i\}_{i=0}^\infty$ like so: $a_0 = 1, \; a_{n+1} = sin(a_n)$ prove that $a_n$ converges"
That is rather easy: since $\sin(x) < x$ for all $x > 0$, the sequence $a_n$ is monotonically decreasing and positive, and thus converges.
It's pretty obvious that $a_n$ converges to $0$. My question is: at what rate does $a_n$ converge to $0$?
Spoiler: I used my computer to find that the answer is probably $\frac{\alpha}{\sqrt{n}}$ where $\alpha \approx 1.732$, but I wasn't able to prove it mathematically.
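For reference, the numerical experiment looks roughly like this (a minimal sketch; the step count and starting point are the ones from the question, the iteration count is arbitrary):

```python
import math

# Iterate a_{n+1} = sin(a_n) starting from a_0 = 1, then check whether
# sqrt(n) * a_n settles near a constant (the conjectured alpha).
a = 1.0
n_steps = 100_000
for _ in range(n_steps):
    a = math.sin(a)

print(math.sqrt(n_steps) * a)  # ≈ 1.732
```

The printed value is suspiciously close to $\sqrt{3} \approx 1.7320508$.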
By the Taylor approximation $\sin x \approx x - \frac{x^3}{6}$,
$$\frac{a_{n+1}}{a_n}\approx1-\frac{a_n^2}6$$ tends to $1$, so the convergence is sublinear.
As
$$\frac{a_{n+2}-a_{n+1}}{a_{n+1}-a_n}$$ tends to $1$, the convergence is said to be logarithmic, i.e. very slow.
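The exact constant can be recovered; here is a sketch (an addition beyond the answer above) using the Stolz–Cesàro theorem:
$$\frac{1}{a_{n+1}^2}-\frac{1}{a_n^2}=\frac{a_n^2-\sin^2 a_n}{a_n^2\,\sin^2 a_n}\longrightarrow\frac13,$$
since $\sin x = x-\frac{x^3}{6}+O(x^5)$ gives $x^2-\sin^2 x=\frac{x^4}{3}+O(x^6)$, while $x^2\sin^2 x\sim x^4$ as $x\to0$. By Stolz–Cesàro, $\frac{1}{n\,a_n^2}\to\frac13$, hence $a_n\sim\sqrt{3/n}$ and $\alpha=\sqrt3\approx1.7320508$, matching the numerical estimate.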