Convergence of a real sequence


Consider the real sequence $(x_k)$, $k=0,1,2,\dots$, defined by $x_0=1$ and, for $k>0$, $$ x_k=x_{k-1}+\frac{\gamma}{k}x_{k-1}^2,\qquad (\gamma>0).$$ I would like to show that the sequence is bounded (and hence convergent, since it is increasing). Any ideas for proving this would be appreciated.

Hint: Numerical experiments suggest to me that the sequence is bounded if and only if $\gamma<1$.


BEST ANSWER

Since $\gamma > 0$, the sequence $(x_k)$ is increasing; in particular, $x_k \ge 1$ for all $k$. Since $$x_k - x_{k-1} = \frac{\gamma}{k} x_{k-1}^2,$$ you get $$x_{n} - x_0 = \sum_{k=1}^n (x_k - x_{k-1}) = \sum_{k=1}^n \frac{\gamma}k x_{k-1}^2 \ge \gamma \sum_{k=1}^n \frac 1k.$$ That is, $$x_n \ge 1 + \gamma \sum_{k=1}^n \frac 1k,$$ and since the harmonic series diverges, the sequence is unbounded as $n \to \infty$. In particular, it diverges for every $\gamma > 0$, contrary to the conjecture in the question.
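A quick numerical sketch of the argument above (a minimal Python check with an arbitrarily chosen $\gamma = 0.5$; note the iterate grows doubly exponentially, so the loop stops before floating-point overflow):

```python
# Iterate x_k = x_{k-1} + (gamma / k) * x_{k-1}^2 with x_0 = 1, and check
# the telescoping lower bound x_n >= 1 + gamma * H_n at every step,
# where H_n is the n-th harmonic number.
gamma = 0.5  # any gamma > 0 exhibits the same blow-up; 0.5 is arbitrary

x = 1.0          # x_0 = 1
harmonic = 0.0   # running harmonic sum H_k
for k in range(1, 60):
    x = x + (gamma / k) * x * x
    harmonic += 1.0 / k
    # Lower bound derived from the telescoping sum in the answer
    assert x >= 1 + gamma * harmonic
    if x > 1e12:  # stop once unboundedness is evident, before overflow
        print(f"x_{k} already exceeds 1e12 -- the sequence is unbounded")
        break
```

Even for $\gamma < 1$ the iterate escapes to infinity within a couple of dozen steps, which is consistent with the proof and suggests the numerical experiments in the question simply did not run long enough.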