Recursion with convergent coefficients


Suppose we have a linear recurrence $x_{n+1}=a_n x_n + b_n x_{n-1}$ such that $a_n\to a$ and $b_n\to b$, and where the roots $\lambda_1,\lambda_2$ of $x^2-ax-b=0$ are distinct and real with $0<|\lambda_1|<1<\lambda_2$. If $v_n$ is any solution of the recurrence, we get the upper bound $v_n = O((\lambda_2+\varepsilon)^n)$ for every $\varepsilon>0$ (as can be seen in this question here).

In general I don't expect a matching lower bound $\Omega((\lambda_2-\varepsilon)^n)$: for example, the constant zero sequence is a solution, and a solution can also get stuck in the contracting direction coming from the smaller root. Is there a simple condition that does give this lower bound? For example, if I can somehow show that $v_n \to \infty$, is it true that $v_n$ behaves more or less like $\lambda_2^n$, up to that $\varepsilon$ error?
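As a concrete sanity check, here is a small numerical experiment; the coefficients $a_n = 1 + 1/n$, $b_n = 1$ are my own illustrative choice, with $a = b = 1$, so $\lambda_2$ is the golden ratio. For a generic positive start, the consecutive ratio $v_{n+1}/v_n$ indeed settles near $\lambda_2$:

```python
import math

# Illustrative coefficients (my choice): a_n = 1 + 1/n -> 1, b_n = 1 -> 1,
# so the limit equation is x^2 - x - 1 = 0 with lambda_2 = golden ratio.
a = lambda n: 1.0 + 1.0 / n
b = lambda n: 1.0

# Track only the ratio r_n = v_{n+1} / v_n, which satisfies
# r_n = a_n + b_n / r_{n-1}; this avoids overflow from v_n itself.
r = 1.0  # starting from v_0 = v_1 = 1, so r_0 = 1
for n in range(1, 100_000):
    r = a(n) + b(n) / r

golden = (1 + math.sqrt(5)) / 2
print(r, golden)  # the ratio approaches lambda_2 ~ 1.618
```

Tracking the ratio rather than $v_n$ itself is just a trick to keep the numbers bounded; it does not by itself prove anything about which solutions attain this rate.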




For anyone interested, the answer to my question is that the solutions which do not converge to zero must behave like $\lambda_2^n$.

The proof idea is, as usual, to first pass to matrices: the recursion becomes $$\left(v_{n-1},v_{n}\right)\underbrace{\left(\begin{array}{cc} 0 & b_{n}\\ 1 & a_{n} \end{array}\right)}_{M_{n}}=\left(v_{n},v_{n+1}\right).$$ By assumption, the matrices $M_n$ converge to a diagonalizable matrix $M_\infty$ with eigenvalues $\lambda_1, \lambda_2$. If $P$ is the conjugating matrix, namely $D_\infty=PM_\infty P^{-1}$ is diagonal, then we can instead study the matrices $D_n := PM_n P^{-1}$, which converge to $D_\infty$, and more precisely the products $(x_n ,y_n ) =(x_0, y_0) D_1 D_2 \cdots D_n$.
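The conjugation step can be checked numerically; the coefficients $a_n = 1 + 1/n$, $b_n = 1$ below are my own illustrative choice, and the sketch just verifies that the conjugated matrices $D_n = P M_n P^{-1}$ become diagonal in the limit:

```python
import numpy as np

# Illustrative coefficients (my choice): a_n = 1 + 1/n, b_n = 1,
# so M_inf = [[0, 1], [1, 1]] with eigenvalues lambda_1, lambda_2.
def M(n):
    return np.array([[0.0, 1.0], [1.0, 1.0 + 1.0 / n]])

M_inf = np.array([[0.0, 1.0], [1.0, 1.0]])
w, V = np.linalg.eig(M_inf)   # M_inf = V @ diag(w) @ inv(V)
P = np.linalg.inv(V)          # so D_inf = P @ M_inf @ inv(P) is diagonal

def D(n):
    return P @ M(n) @ V       # inv(P) == V

# The off-diagonal part of D_n vanishes as M_n -> M_inf.
for n in [10, 1000, 100_000]:
    off = abs(D(n)[0, 1]) + abs(D(n)[1, 0])
    print(n, off)             # decreases like O(1/n), since M_n - M_inf = O(1/n)
```

Note that `np.linalg.eig` does not guarantee any particular ordering of the eigenvalues, so this only checks diagonality, not which eigenvalue lands where.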

Assuming the larger eigenvalue sits in the top left corner of $D_\infty$: if $D_n = D_\infty$ for all $n$, then $\frac{y_n}{x_n} = \frac{y_0 \lambda_1^n}{x_0 \lambda_2^n} \to 0$. A bit of standard $\varepsilon$-chasing shows that in the general case, if $\frac{y_n}{x_n}$ is bounded (so the vector never gets stuck very close to the direction $(0,1)$), then we must in fact have $\frac{y_n}{x_n} \to 0$.
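This collapse of the ratio can also be seen numerically in the same illustrative setup ($a_n = 1 + 1/n$, $b_n = 1$ is my choice, not from the question), iterating the row vector $(x_n, y_n) = (x_0, y_0) D_1 \cdots D_n$ with rescaling:

```python
import numpy as np

# Illustrative setup (my choice): a_n = 1 + 1/n, b_n = 1.
M_inf = np.array([[0.0, 1.0], [1.0, 1.0]])
w, V = np.linalg.eig(M_inf)
order = np.argsort(-w)        # put lambda_2 in the top-left of D_inf
V = V[:, order]
P = np.linalg.inv(V)

def D(n):
    M_n = np.array([[0.0, 1.0], [1.0, 1.0 + 1.0 / n]])
    return P @ M_n @ V        # inv(P) == V

# Iterate (x_n, y_n) = (x_0, y_0) D_1 ... D_n, rescaling each step so
# nothing overflows; rescaling does not change the ratio y_n / x_n.
vec = np.array([1.0, 1.0])
for n in range(1, 10_000):
    vec = vec @ D(n)
    vec /= np.abs(vec).max()

print(vec[1] / vec[0])        # y_n / x_n -> 0 for this generic start
```

A generic start like $(1,1)$ keeps the ratio bounded, so the observed decay matches the claim; a start exactly on the contracting direction would behave differently.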

Going back to the original recursion, the condition is that if $|v_n|$ grows a little faster than $|\lambda_1|^n$, then it grows at least as fast as $(\lambda_2-\delta)^n$, for any $\delta>0$ as small as we want.
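This conclusion can be tested numerically too, again with my illustrative coefficients $a_n = 1 + 1/n$, $b_n = 1$: for a solution that does not tend to zero, the exponential growth rate $\frac{1}{n}\log|v_n|$ approaches $\log\lambda_2$:

```python
import math

# Illustrative coefficients (my choice): a_n = 1 + 1/n, b_n = 1,
# with limit roots lambda_1 ~ -0.618 and lambda_2 = golden ratio ~ 1.618.
a = lambda n: 1.0 + 1.0 / n
b = lambda n: 1.0

# Iterate the pair (v_{n-1}, v_n), rescaling to avoid overflow while
# accumulating the log of the scale, so we recover (1/N) log|v_N|.
p, q = 1.0, 1.0     # v_0 = v_1 = 1, a solution that does not go to 0
log_scale = 0.0
N = 100_000
for n in range(1, N):
    p, q = q, a(n) * q + b(n) * p
    s = max(abs(p), abs(q))
    p, q, log_scale = p / s, q / s, log_scale + math.log(s)

growth = (log_scale + math.log(abs(q))) / N
print(growth, math.log((1 + math.sqrt(5)) / 2))  # both close to log(lambda_2)
```

The zero solution is of course excluded, and this experiment does not exhibit a decaying solution; it only illustrates the dichotomy for one non-vanishing choice of initial values.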