Let $\{x_n\}_{n \in \mathbb{N}}$ be a sequence such that $$ x_1 > 0, \qquad x_{n+1} = x_n + \frac{1}{3 x_n^2} $$ Check whether the sequence converges, and if so find its limit.
I know that a sequence converges if it is bounded and monotone, so I tried to prove both (hopefully correctly):
Proving that $x_n > 0$ for all $n$:
Base: $x_1 > 0$
Hypothesis: $x_n > 0$
Step: $$ \text{$x_n > 0$ and $\frac{1}{3 x_n^2} > 0$} \implies x_n + \frac{1}{3 x_n^2} > 0 \implies x_{n+1} > 0 $$

Proving that the sequence is monotone increasing: $$ x_{n+1} - x_n = x_n + \frac{1}{3 x_n^2} - x_n = \frac{1}{3 x_n^2} > 0 \qquad \text{for all $n$} $$
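As a sanity check (not a proof), the two properties can be observed numerically by iterating the recurrence; the starting value `x1 = 1.0` below is an arbitrary choice:

```python
def iterate(x1, steps):
    """Iterate x_{n+1} = x_n + 1/(3*x_n^2) and return the whole orbit."""
    xs = [x1]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + 1.0 / (3.0 * x * x))
    return xs

xs = iterate(1.0, 1000)

# Positivity and monotonicity, as in the induction above:
assert all(x > 0 for x in xs)
assert all(b > a for a, b in zip(xs, xs[1:]))

print(xs[-1])  # keeps growing with n; no sign of leveling off
```

The assertions pass, but note this only confirms the proved properties along one orbit; it says nothing by itself about whether the growth is bounded.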
Now, I've tried finding the limit, like this: $$ \exists \alpha = \lim_{n \to \infty} x_n \implies \alpha = \alpha + \frac{1}{3 \alpha^2} \implies \frac{1}{3 \alpha^2} = 0 $$
This equation has no solution, so I conclude that the limit is infinite, i.e. that the sequence diverges. So what am I doing wrong here? Is one of my proofs incorrect, or is my limit calculation wrong?
Thanks
Your proofs are fine, and nothing is wrong: the contradiction shows that no finite limit exists, and since the sequence is increasing it must diverge to $+\infty$. (Monotonicity guarantees convergence only when the sequence is also bounded, which this one is not.) To quantify the growth, you may simply notice that $x_{n+1}^3 \geq x_n^3+1$ implies $x_n\geq\sqrt[3]{n-1+x_1^3}$.
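For completeness, the inequality $x_{n+1}^3 \geq x_n^3+1$ comes from cubing the recurrence with the binomial formula:

$$ x_{n+1}^3 = \left(x_n + \frac{1}{3x_n^2}\right)^3 = x_n^3 + 3x_n^2\cdot\frac{1}{3x_n^2} + 3x_n\cdot\frac{1}{9x_n^4} + \frac{1}{27x_n^6} = x_n^3 + 1 + \frac{1}{3x_n^3} + \frac{1}{27x_n^6} \geq x_n^3 + 1, $$

since the last two terms are positive. Telescoping then gives $x_n^3 \geq x_1^3 + (n-1)$, hence $x_n \geq \sqrt[3]{n-1+x_1^3} \to \infty$.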