How to derive the equation form of a recursive relation?


The four theorems that tell us how to convert a recurrence relation into an equation can be verified by induction. But how do we derive them in the first place?


The technique for solving homogeneous linear recurrence relations with constant coefficients is old enough that I don't know whether you'll find a definitive reference for how it originated.

A plausible guess, however, is that someone noticed that the $1^{st}$ order homogeneous linear recurrence $f(n) = a \cdot f(n-1)$ has the geometric progression $f(n)=a^n \cdot f(0)$ as its solution.
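As a quick sanity check, here is a minimal sketch comparing direct iteration of the first-order recurrence against the geometric closed form; the values $a=3$ and $f(0)=2$ are arbitrary choices for illustration.

```python
def f_recursive(n, a=3, f0=2):
    """Compute f(n) by iterating the recurrence f(n) = a * f(n-1)."""
    value = f0
    for _ in range(n):
        value = a * value
    return value

def f_closed(n, a=3, f0=2):
    """Closed form f(n) = a**n * f(0)."""
    return a**n * f0

# The two agree for every n.
for n in range(10):
    assert f_recursive(n) == f_closed(n)
```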

When stepping up to $2^{nd}$ order recurrences $f(n) = a f(n-1) + b f(n-2)$ it is only natural to wonder if those might, by any chance, have solutions in terms of geometric progressions as well.

To check that, one would assume the form $f(n)=x^n$; substituting it into the definition of the recurrence gives $x^n = a x^{n-1} + b x^{n-2}$ or, after simplifying, $x^2 = a x + b$. Therefore any root $x_{1,2}$ of the characteristic polynomial $x^2 - a x - b$ provides a solution to the recurrence and, since the recurrence is linear, any combination $f(n) = \lambda x_1^n + \mu x_2^n$ satisfies the recurrence as well, where the constants $\lambda, \mu$ remain to be determined by the initial conditions $f(0), f(1)$.
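The steps above can be sketched numerically. Taking the Fibonacci recurrence as a concrete example (my choice, not part of the question): $a = b = 1$ with $f(0)=0,\ f(1)=1$, so the characteristic polynomial is $x^2 - x - 1$.

```python
import math

a, b = 1, 1        # recurrence f(n) = a*f(n-1) + b*f(n-2)
f0, f1 = 0, 1      # initial conditions

# Roots x1, x2 of x^2 - a*x - b = 0 via the quadratic formula.
disc = math.sqrt(a * a + 4 * b)
x1 = (a + disc) / 2
x2 = (a - disc) / 2

# Determine lambda, mu from the 2x2 linear system
#   lambda      + mu      = f(0)
#   lambda*x1   + mu*x2   = f(1)
lam = (f1 - f0 * x2) / (x1 - x2)
mu = f0 - lam

def f_closed(n):
    return lam * x1**n + mu * x2**n

# Compare against direct iteration of the recurrence.
fib = [f0, f1]
for n in range(2, 20):
    fib.append(a * fib[-1] + b * fib[-2])

for n in range(20):
    assert abs(f_closed(n) - fib[n]) < 1e-6
```

For Fibonacci this recovers Binet's formula, with $\lambda = 1/\sqrt{5}$ and $\mu = -1/\sqrt{5}$.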

( Looking back at the $1^{st}$ order recurrence to reconcile the two forms: it can be rewritten as $f(n)=\lambda x_1^n$, where $x_1$ is the root of the characteristic polynomial $x-a=0$ and $\lambda = f(0)$ is the constant determined by the initial condition. )

The above extends to homogeneous recurrences of higher orders in a straightforward way.
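To illustrate the higher-order case, here is a sketch using an illustrative $3^{rd}$ order recurrence of my own choosing, $f(n) = 6 f(n-1) - 11 f(n-2) + 6 f(n-3)$, whose characteristic polynomial $x^3 - 6x^2 + 11x - 6$ has the distinct roots $1, 2, 3$. The constants come from solving a Vandermonde system against the initial conditions.

```python
import numpy as np

coeffs = [6, -11, 6]    # f(n) = 6*f(n-1) - 11*f(n-2) + 6*f(n-3)
init = [3, 6, 14]       # f(0), f(1), f(2), arbitrary initial values

# Roots of the characteristic polynomial x^3 - 6x^2 + 11x - 6.
roots = np.roots([1, -6, 11, -6])

# Solve the Vandermonde system sum_j c_j * x_j^k = f(k), k = 0, 1, 2.
V = np.vander(roots, increasing=True).T
consts = np.linalg.solve(V, init)

def f_closed(n):
    return sum(c * r**n for c, r in zip(consts, roots)).real

# Compare against direct iteration of the recurrence.
seq = list(init)
for n in range(3, 15):
    seq.append(sum(c * seq[-1 - i] for i, c in enumerate(coeffs)))

for n in range(15):
    assert abs(f_closed(n) - seq[n]) < 1e-6
```

With these initial values the solved constants come out to $1, 1, 1$, i.e. $f(n) = 1^n + 2^n + 3^n$; the same machinery works for any order as long as the roots are distinct (repeated roots need the usual $n^k x^n$ terms instead).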