What would a proof for this theorem look like?
Suppose that $(a_n)_n$ is a sequence such that $a_{n+1}/a_n$ tends to $\ell$. Prove that if $\ell>1$ and $a_n>0$ for all $n$ in the naturals, then $(a_n)_n$ tends to infinity.
Since $\ell>1$, fix a $k$ with $1<k<\ell$ (take, for instance, $k=(\ell+1)/2$). Because $a_{n+1}/a_n\to\ell$, there exists $N$ such that, for all $n\ge N$, $$ \frac{a_{n+1}}{a_n}>k>1. $$ Since $a_N>0$ by hypothesis, $$ a_{N+1}>ka_{N},\quad a_{N+2}>ka_{N+1}>k^2a_{N},\quad\dots $$ and, by induction, $$ a_{N+p}>k^pa_N $$ for every $p$. Because $k>1$ and $a_N>0$, the lower bound $k^pa_N$ tends to infinity as $p\to\infty$, hence so does $a_{N+p}$; that is, $(a_n)_n$ tends to infinity.
(If instead $a_N<0$ were allowed, the same induction would give the reversed inequality $a_{N+p}<k^pa_N$; under the hypothesis $a_n>0$ this case does not arise.)
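For readers who want to pin the statement down precisely, here is a rough Lean 4 rendering of the theorem being proved (this assumes mathlib is available; only the statement is shown, with the proof left as a stub, and the variable names are illustrative):

```lean
import Mathlib

open Filter

-- A rough Lean 4 statement of the theorem (assuming mathlib).
-- Only the statement is given; the proof is deliberately left as a stub.
example (a : ℕ → ℝ) (l : ℝ)
    (hpos : ∀ n, 0 < a n)
    (hl : 1 < l)
    (hlim : Tendsto (fun n => a (n + 1) / a n) atTop (nhds l)) :
    Tendsto a atTop atTop := by
  sorry
```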
If $\dfrac{a_{n+1}}{a_n} \to L$ then, for any $c > 0$, there is an $n(c)$ such that $\dfrac{a_{n+1}}{a_n} > L-c$ for all $n > n(c)$.
Since $L > 1$, let $d = L-1 > 0$ and choose $c < d/2$.
Then, for $n > n(c)$, $$\frac{a_{n+1}}{a_n} > L-c > L-\frac{L-1}{2} = \frac{1+L}{2} = \frac{1+1+d}{2} = 1+\frac d2,$$ so, multiplying these ratios (all terms are positive), $$\frac{a_{n+k}}{a_n} = \prod_{j=0}^{k-1}\frac{a_{n+j+1}}{a_{n+j}} > \left(1+\frac d2\right)^k \to \infty \quad\text{as } k \to \infty.$$ Fixing any $n > n(c)$, this gives $a_{n+k} > \left(1+\frac d2\right)^k a_n \to \infty$ because $a_n > 0$, so $(a_n)_n$ tends to infinity.
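The divergence fact this argument leans on, namely that $r^k\to\infty$ for a fixed real $r>1$, is also available off the shelf if you want to formalize the step. A minimal Lean 4 sketch, assuming mathlib and that the lemma name below is current:

```lean
import Mathlib

open Filter

-- The step used above: for a fixed real r > 1, the powers r^k blow up.
-- `tendsto_pow_atTop_atTop_of_one_lt` is the mathlib lemma I believe covers this;
-- if the name has drifted between versions, `exact?` should find the current one.
example (r : ℝ) (hr : 1 < r) :
    Tendsto (fun k : ℕ => r ^ k) atTop atTop :=
  tendsto_pow_atTop_atTop_of_one_lt hr
```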