Why do we substitute $\alpha^n$ in the recurrences of the form $ax_n=bx_{n-1}+cx_{n-2}$?


I encountered the recurrence relation $2x_n-3x_{n-1}+x_{n-2}=0$ with $x_0=1$, $x_1=1$, and had no idea how to go about it. However, Google pointed me to page 18 of Herbert Wilf's *Algorithms and Complexity*, where he substitutes $x_n=\alpha^n$, obtains a quadratic equation, and solves the Fibonacci recurrence using its roots. This left me with two questions:

1) Why should a general term of the recurrence relation $x_{n+1}=ax_{n}+bx_{n-1}$ ($n\ge 1$, $x_0,x_1$ given) be of the form $x_n=\alpha^n$ for some $\alpha$?

2) Why should $x_n$ be a linear combination of powers of the roots of the quadratic equation obtained by substituting $x_n=\alpha^n$?

---
Consider a second-order linear difference equation with constant coefficients $$x_{n+1}=ax_n+bx_{n-1}$$ for some $a,b$, given $x_0,x_1$. Now, recognize that if $X_n$ is a solution to our equation, i.e. $X_{n+1}=aX_n+bX_{n-1}$, then so is $kX_n$ for any constant $k$: $$kX_{n+1}=akX_n+bkX_{n-1}=k(aX_n+bX_{n-1})$$

Similarly, if $X_n$ and $Y_n$ are solutions to our equation, i.e. $X_{n+1}=aX_n+bX_{n-1}$ and $Y_{n+1}=aY_n+bY_{n-1}$, it follows readily from adding these equations that $X_n+Y_n$ is also a solution. $$X_{n+1}+Y_{n+1}=aX_n+bX_{n-1}+aY_n+bY_{n-1}=a(X_n+Y_n)+b(X_{n-1}+Y_{n-1})$$

It follows then that our solutions form a vector space (all other axioms are trivial; $x_n=0$ is our zero vector, etc.). Its dimension is 2, since a solution is completely determined by the two initial values $x_0$ and $x_1$.
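The closure properties above can be checked numerically. A minimal sketch in Python, with coefficients and seed values chosen arbitrarily for illustration (Fibonacci- and Lucas-style seeds):

```python
def solve(a, b, x0, x1, n):
    """Iterate x_{k+1} = a*x_k + b*x_{k-1} to produce n terms from seeds x0, x1."""
    xs = [x0, x1]
    for _ in range(n - 2):
        xs.append(a * xs[-1] + b * xs[-2])
    return xs

a, b = 1, 1                    # illustrative coefficients (Fibonacci case)
X = solve(a, b, 0, 1, 10)      # one solution
Y = solve(a, b, 2, 1, 10)      # another solution (Lucas-style seeds)

# k*X and X+Y satisfy the same recurrence, so iterating from their
# first two terms must reproduce them term by term.
assert solve(a, b, 3 * X[0], 3 * X[1], 10) == [3 * x for x in X]
assert solve(a, b, X[0] + Y[0], X[1] + Y[1], 10) == [x + y for x, y in zip(X, Y)]
```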

We may guess a solution of the form $x_n=\alpha^n$ to yield: $$\begin{align*}\alpha^{n+1}&=a\alpha^n+b\alpha^{n-1}\\&=\alpha^{n-1}(a\alpha+b)\\&\implies\alpha^2-a\alpha-b=0\end{align*}$$ If this quadratic equation has two distinct roots $\alpha_1,\alpha_2$, we have determined two linearly independent solutions -- a fundamental set of solutions, a basis for our vector space. Thus every solution may be written as a linear combination of these solutions i.e. in the form:$$x_n=c_1\alpha_1^n+c_2\alpha_2^n$$
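Applied to the recurrence from the question, $2x_n-3x_{n-1}+x_{n-2}=0$ with $x_0=x_1=1$: substituting $x_n=\alpha^n$ gives $2\alpha^2-3\alpha+1=0$, with roots $\alpha_1=1$ and $\alpha_2=\tfrac12$, and matching the initial values forces $c_1=1$, $c_2=0$, so $x_n=1$ for all $n$. A short sketch verifying this against direct iteration, using exact arithmetic:

```python
from fractions import Fraction

# Roots of the characteristic equation 2*alpha^2 - 3*alpha + 1 = 0.
a1, a2 = Fraction(1), Fraction(1, 2)

# Solve c1 + c2 = x0 and c1*a1 + c2*a2 = x1 for the coefficients.
x0, x1 = Fraction(1), Fraction(1)
c2 = (x1 - x0 * a1) / (a2 - a1)    # = 0
c1 = x0 - c2                       # = 1

closed_form = [c1 * a1**n + c2 * a2**n for n in range(8)]

# Compare against direct iteration of x_n = (3*x_{n-1} - x_{n-2}) / 2.
xs = [x0, x1]
for _ in range(6):
    xs.append((3 * xs[-1] - xs[-2]) / 2)

assert closed_form == xs           # here every term equals 1
```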

---

Consider a general linear recurrence relation of the form:

$$x_{n+1} = \sum_{k=0}^{N} a_k x_{n-k}\tag{*}$$

In addition to being linear, the equation $(*)$ is invariant under translation. This means that if $(x_n)$ is a solution of $(*)$, so is the sequence $(y_n)$ defined by $y_n = x_{n+1}$.

Just as with two commuting Hermitian matrices, where you can look for simultaneous eigenvectors of both matrices, instead of finding solutions of $(*)$ alone one can look for solutions that are "eigenvectors" of the translation operation.

That is, we look for a solution $(x_n)$ satisfying the additional constraint:

$$y_n = \alpha x_n \iff x_{n+1} = \alpha x_n \iff x_n \propto \alpha^n$$
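A small numeric sketch of this view, taking the Fibonacci-type recurrence $x_{n+1}=x_n+x_{n-1}$ as an illustrative case: the shift $(x_n)\mapsto(x_{n+1})$ acts on the geometric sequence $\alpha^n$ as multiplication by $\alpha$, and that sequence solves the recurrence precisely because $\alpha$ is a root of the characteristic polynomial $t^2-t-1$.

```python
import math

alpha = (1 + math.sqrt(5)) / 2          # root of t^2 - t - 1 = 0 (golden ratio)
x = [alpha**n for n in range(10)]       # candidate "eigenvector" sequence

# The shift acts on x as multiplication by alpha...
shifted = x[1:]
assert all(math.isclose(s, alpha * v) for s, v in zip(shifted, x))

# ...and x satisfies the recurrence x_{n+1} = x_n + x_{n-1},
# because alpha^2 = alpha + 1.
assert all(math.isclose(x[n + 1], x[n] + x[n - 1]) for n in range(1, 9))
```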