How to find the divergence rate of a recursive sequence defined by $s_n = s_{n-1}(1 + c_n s_{n-1})$

My question concerns a sequence of numbers defined recursively by $s_n = s_{n-1}(1 + c_n s_{n-1})$, where $(c_n)$ is a sequence of positive numbers and $s_0 = \epsilon > 0$ is small (how small may depend on $(c_n)$). I know that if $(c_n)$ is summable, then $(s_n)$ converges provided $s_0$ is sufficiently small. When $(c_n)$ is not summable, I believe $(s_n)$ diverges, but I don't know how its divergence rate relates to $(c_n)$. For small $s_0$, it seems to diverge slightly faster than the partial sums of $(c_n)$, but I don't know how to make this more precise.
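As a quick numerical sketch (my own illustration, not part of the question), one can iterate the recursion for a summable and a non-summable choice of $c_n$ and watch the contrasting behaviour:

```python
def iterate(c, s0, n_max, cap=1e100):
    """Iterate s_k = s_{k-1}(1 + c(k) s_{k-1}); stop early if s exceeds cap."""
    s = s0
    for k in range(1, n_max + 1):
        s = s * (1 + c(k) * s)
        if s > cap:
            return s, k  # blew past the cap at step k
    return s, n_max

# c_n = 1/n^2 is summable: s_n converges for small s_0.
s_conv, _ = iterate(lambda k: 1.0 / k**2, s0=0.1, n_max=100000)

# c_n = 1/n is not summable: s_n diverges (here s_0 = 0.5 so the blow-up
# is visible within a few steps; for tiny s_0 the divergence is still
# guaranteed but takes astronomically many steps to become visible).
s_div, step = iterate(lambda k: 1.0 / k, s0=0.5, n_max=100000)

print(f"summable case:     s_n -> {s_conv:.6f}")
print(f"non-summable case: exceeded 1e100 at step {step}")
```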
219 Views Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 best solutions below.
Showing the divergence. It is easy to show by induction that $(s_n)$ is a positive sequence.
Proposition 1. $(s_n)$ is non-decreasing.
Since $(c_n)$ is positive, $$s_n-s_{n-1} = s_{n-1}(1 + c_n s_{n-1})-s_{n-1}=s_{n-1}^2c_n\geq 0 \tag{1}$$ that is, $$s_n\geq s_{n-1}$$
Proposition 2. $$s_n-s_{n-1} \geq \varepsilon^2 c_n \tag{2}$$
From $(1)$ and the monotonicity of $(s_n)$, $$s_n-s_{n-1} = s_{n-1}^2c_n\geq s_{n-2}^2c_n\geq \dots \geq s_0^2c_n=\varepsilon^2 c_n$$
Proposition 3. If $\sum\limits_{n=1}^{\infty} c_n = \infty$ then $\lim\limits_{n\rightarrow\infty}s_n=\infty$.
$$s_n = s_n-s_0+s_0=(s_n-s_{n-1})+(s_{n-1}-s_0)+s_0=\\ (s_n-s_{n-1})+(s_{n-1}-s_{n-2})+(s_{n-2}-s_0)+s_0=...\\ ...=s_0+\sum\limits_{k=1}^n(s_k-s_{k-1})\overset{(2)}{\geq} s_0+\sum\limits_{k=1}^n \varepsilon^2 c_k=\\ s_0+\varepsilon^2 \sum\limits_{k=1}^n c_k$$ Taking the limit $n\rightarrow\infty$ leads to the desired result.
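The telescoped bound $s_n \geq s_0 + \varepsilon^2 \sum_{k=1}^n c_k$ is easy to check numerically; here is a quick sketch of my own, using $c_n = 1/n$ and $\varepsilon = 0.01$:

```python
# Verify s_n >= s_0 + eps^2 * sum_{k=1}^n c_k at every step for c_n = 1/n.
eps = 0.01
s = eps
partial = 0.0
for k in range(1, 5001):
    c_k = 1.0 / k
    s = s * (1 + c_k * s)
    partial += c_k
    # small tolerance for floating-point rounding (equality holds at k = 1)
    assert s + 1e-12 >= eps + eps**2 * partial

print(f"s_5000 = {s:.6f},  lower bound = {eps + eps**2 * partial:.6f}")
```

Note how slowly the divergence proceeds for a small $s_0$: the lower bound only grows like $\varepsilon^2$ times the harmonic sum.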
First, the double exponential lower bound.
Note that $c_n=\frac{s_n-s_{n-1}}{s_{n-1}^2}$. Let $K>0$. The sum of the right-hand sides can be viewed as the left Riemann sum of the function $x\mapsto \frac 1{x^2}$ on $[s_0,+\infty)$ corresponding to the partition with nodes $s_n$. If we had $s_{n}\le (K+1)s_{n-1}$ for all $n$, that Riemann sum would be comparable to $\int_{s_0}^\infty\frac{dx}{x^2}<+\infty$, contradicting the divergence of $\sum_n c_n$. Thus we must have $c_n s_{n-1}>K$ for some $n$. Since $c_{n+k}\ge c_n\delta^k$ (the regularity assumption), setting $t_k=c_n s_{n+k}$ we get the inequality $$ t_k=c_ns_{n+k}\ge \delta^k(c_n s_{n+k-1})^2=\delta^k t_{k-1}^2 $$ for $k\ge 0$, with $t_{-1}=c_n s_{n-1}>K$. If we replace the inequalities by equalities, the solution of this recursion is $$ T_k=K^{2^k}\delta^{k+2(k-1)+2^2(k-2)+\dots+2^{k-1}} $$ Since $\sum_{m=1}^{\infty} 2^{-m}m=\sigma<+\infty$, we get $t_k>T_k\ge [K\delta^\sigma]^{2^k}$, which is double-exponential growth as soon as the product in brackets is greater than $1$.
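The double-exponential regime is easy to see numerically; here is a sketch of my own using the constant sequence $c_n = 1$ (which satisfies $c_{n+k}\ge c_n\delta^k$ with $\delta = 1$) and an $s_0$ with $c_n s_{n-1} > 1$ from the start, so $s_n = s_{n-1}(1+s_{n-1})$ roughly squares at each step:

```python
import math

# Track log s_n; in the double-exponential regime it should roughly
# double at each step, i.e. log s_n ~ const * 2^n.
s = 2.0
logs = []
for _ in range(9):           # 9 steps keeps s within float range
    s = s * (1 + s)
    logs.append(math.log(s))

ratios = [logs[k] / logs[k - 1] for k in range(1, len(logs))]
print(ratios)  # the ratios approach 2
```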
To get the upper bound, it is enough to assume that $c_n\le C-1$ and to start with $s_0=1$. Then $s_{n-1}\ge 1$ for all $n$, so $$Cs_n = Cs_{n-1}+Cc_ns_{n-1}^2\le Cs_{n-1}^2+C(C-1)s_{n-1}^2=(Cs_{n-1})^2,$$ hence $s_n\le C^{2^n}$. Now, by choosing $s_0$ small enough, we can keep $s_n$ below $1$ for as long a time $n_0$ as we want, which shifts this bound to $C^{2^{n-n_0}}=\exp(2^{-n_0}\log C\cdot 2^n)$. The rest should be clear.
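The upper bound can also be checked numerically; a minimal sketch of my own, taking the extreme case $c_n = C-1$ with $C = 2$ and $s_0 = 1$:

```python
# With c_n = C - 1 and s_0 = 1 we should have s_n <= C^(2^n) for all n.
C = 2.0
s = 1.0
for n in range(1, 9):
    s = s * (1 + (C - 1) * s)   # the recursion with c_n = C - 1
    assert s <= C ** (2 ** n)   # s_n <= C^(2^n)

print(f"bound holds through n = 8; s_8 = {s:.3e}")
```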