Finding the asymptotic of the sequence $a_{n+1}=a_n+\frac{1}{f(a_n)}$


Let $f(x)$ be a differentiable function with $f^\prime\ge 0$, $f^\prime$ bounded, and $f(x)\to+\infty$ as $x\to+\infty$. Define the sequence $\{a_n\}$ by $$a_0=1,\qquad a_{n+1}=a_n+\frac{1}{f(a_n)},\quad n\ge 0.$$ My question is: how can we find the asymptotics of $a_n$? For instance, if $f(x)=x$, then $f$ clearly satisfies the conditions and the recurrence becomes $$a_{n+1}=a_n+\frac{1}{a_n},$$ for which one can show the estimate $a_n=\sqrt{2n}+o(\sqrt{2n})$. But what about a general $f$ satisfying these hypotheses? How can we compute the asymptotics of $\{a_n\}$? Thanks in advance.
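A quick numerical sanity check for the $f(x)=x$ case (my own sketch, not part of the original question; the helper name `iterate` and the step count are arbitrary choices):

```python
import math

def iterate(f, a0=1.0, steps=1):
    """Iterate a_{n+1} = a_n + 1/f(a_n) starting from a0, returning a_n."""
    a = a0
    for _ in range(steps):
        a += 1.0 / f(a)
    return a

n = 100_000
a_n = iterate(lambda x: x, steps=n)
print(a_n, math.sqrt(2 * n))  # the ratio of these two values approaches 1
```

For $n=10^5$ the two printed values agree to within a fraction of a percent, consistent with $a_n=\sqrt{2n}+o(\sqrt{2n})$.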

Best answer:

I'll assume that $f$ is strictly positive for $x \ge 1$. Then $(a_n)$ is strictly increasing with $a_n \to +\infty$.

The idea of the following calculation is to show that the sequence $(a_n)$ behaves roughly like the solution $y$ of the differential equation $y' = 1/f(y)$, which is given implicitly by $F(y(t)) = t$ (up to an additive constant), where $F$ is an antiderivative of $f$.

Define $F(x) = \int_0^x f(t) \, dt$. Then $$ F(a_{n+1}) - F(a_n) = \int_{a_n}^{a_{n+1}} f(t) \, dt \ge f(a_n) (a_{n+1}-a_n) = 1 $$ and $$ \begin{align} F(a_{n+1}) - F(a_n) &\le f(a_{n+1}) (a_{n+1}-a_n) \\ &= 1 + (f(a_{n+1})-f(a_n))(a_{n+1}-a_n) \\ &\le 1 + M (a_{n+1}-a_n)^2 \\ &= 1 + \frac{M}{f(a_n)^2} \end{align} $$ where $M$ is an upper bound for $f'$ and the second inequality uses the mean value theorem. Since $f(a_n) \to \infty$, it follows that $$ \lim_{n \to \infty} \bigl( F(a_{n+1}) - F(a_n) \bigr) = 1 \, . $$ By the Stolz–Cesàro theorem, this implies that $$ \lim_{n \to \infty} \frac{F(a_n)}{n} = 1 \, , $$ i.e. $a_n \sim F^{-1}(n)$.

Example: For $f(x) = x$ we get $F(x) = x^2/2$ and therefore $a_n \sim \sqrt{2n}$.
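As a further sanity check (my own example, not from the answer): take $f(x)=\sqrt{x}$, whose derivative $f'(x)=1/(2\sqrt{x})$ is bounded on $[1,\infty)$. Then $F(x)=\tfrac{2}{3}x^{3/2}$, so the formula $a_n \sim F^{-1}(n)$ predicts $a_n \sim (3n/2)^{2/3}$. A short numerical sketch:

```python
def iterate(f, a0=1.0, steps=1):
    """Iterate a_{n+1} = a_n + 1/f(a_n) starting from a0, returning a_n."""
    a = a0
    for _ in range(steps):
        a += 1.0 / f(a)
    return a

n = 100_000
a_n = iterate(lambda x: x ** 0.5, steps=n)        # f(x) = sqrt(x)
predicted = (1.5 * n) ** (2.0 / 3.0)              # F^{-1}(n) = (3n/2)^{2/3}
print(a_n, predicted)  # the ratio of these two values tends to 1
```

For $n=10^5$ the iterated value and the prediction $(3n/2)^{2/3}$ agree closely, as the error analysis above suggests.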