Using polynomials as recursions


I made this observation in my discrete math course a while back and explored it further online, so not all of the ideas here are my own. I am still confused about some things, though. Consider the equation $x^2-x-1=0$, whose positive root is the golden ratio $\phi$. Rewriting the equation gives $x^2=x+1$, and for nonzero $x$ we can divide by $x$ to get $x=1+\frac1x$. Now $x$ is expressed recursively in terms of itself, so substituting this expression back into itself gives $x=1+\frac1{1+\frac1x}$, and continuing indefinitely produces a continued fraction that converges to $\phi$. Returning to the quadratic, if $x^2=x+1$, then taking the square root (for positive $x$) gives $x=\sqrt{x+1}$; since $x$ is again expressed in terms of itself, we can substitute back to get $x=\sqrt{\sqrt{x+1}+1}$ and so on, producing an infinitely nested radical that also converges to $\phi$.
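For concreteness, both truncations can be computed numerically. Here is a short Python sketch (the function names are my own, not standard terminology):

```python
# Check numerically that truncations of the continued fraction
# 1 + 1/(1 + 1/(...)) and of the nested radical sqrt(1 + sqrt(1 + ...))
# both approach the golden ratio.

PHI = (1 + 5 ** 0.5) / 2  # positive root of x^2 - x - 1 = 0

def continued_fraction(depth):
    """Evaluate 1 + 1/(1 + 1/(...)) truncated at the given depth."""
    x = 1.0  # innermost value; any positive seed works
    for _ in range(depth):
        x = 1 + 1 / x
    return x

def nested_radical(depth):
    """Evaluate sqrt(1 + sqrt(1 + ...)) truncated at the given depth."""
    x = 1.0
    for _ in range(depth):
        x = (1 + x) ** 0.5
    return x

print(continued_fraction(60) - PHI)  # difference is tiny
print(nested_radical(60) - PHI)      # difference is tiny
```

Both sequences settle to $\phi \approx 1.6180339887$ to machine precision within a few dozen steps.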

What is this process/idea called, where you find a recursion that 'solves itself'? By that I mean that we had an expression for $x$ in terms of itself, which can be used iteratively to compute $x$. How can one recognize that iterated roots or continued fractions can be rewritten as polynomial equations, and vice versa?

Also, I find this argument of 'plugging $x$ into itself' to be ad hoc and non-rigorous. Is there a more rigorous framework in which one could view this kind of thing? And is there a way to tweak this idea so that a recursion for the negative root of the quadratic can be found?


BEST ANSWER

You are discovering fixed point iteration, which can be a powerful technique. You arrange your equation in the form $x=f(x)$, then start with some $x_0$ that you hope is close enough to the solution and iterate $x_{n+1}=f(x_n)$ to convergence. If $f$ is differentiable at the limit $L$ and $|f'(L)| \lt 1$, then the iteration converges in some neighborhood of $L$. Intuitively, the distance from the limit is multiplied by a factor close to $f'(L)$ at each step. There are two parts that require cleverness: you need to arrange your equation so that $|f'(L)| \lt 1$, and you need to find a starting value that is close enough. If the process converges, you have $f(L)=L$; often this equation lets us find the limit in closed form, which is the opposite direction from your approach.

For $x^2-x-1=0$ the quadratic formula gets us there directly, but suppose we wanted to solve $x^{10}+x-1=0$. If $x$ is a little smaller than $1$, then $x^{10}$ will be much smaller than $1$, so the solution is close to $1$ but a little smaller. So I write the recursion as $x_{n+1}=\sqrt[10]{1-x_n}$ and start with $x_0=0.99$. It converges quickly to $x\approx 0.835079069$. A spreadsheet makes this easy: copy down is your friend.

ANSWER

Fix $N\geq2$ and define $$\begin{array}{cccc} T\colon&\left[1+\frac{1}{N},N\right]&\to&\left[1+\frac{1}{N},N\right] \\ &x &\mapsto &1+\frac{1}{x} \end{array}$$ (one checks that $T$ does map this interval into itself: for $x\in\left[1+\frac{1}{N},N\right]$ we have $1+\frac{1}{N}\leq 1+\frac{1}{x}\leq 2\leq N$), and note that $$|T(x)-T(y)| = \left|\frac{1}{x}-\frac{1}{y}\right| = \frac{|x-y|}{xy}\leq\frac{|x-y|}{\left(1+\frac{1}{N}\right)^2}.$$ Since $\left(1+\frac{1}{N}\right)^2>1$, $T$ is a contraction, so by the Banach fixed-point theorem $T$ has a unique fixed point, given by $\displaystyle\lim_{n\to\infty}T^n(x)$ for any $x\in\left[1+\frac{1}{N},N\right]$; that fixed point is exactly $\phi$.

Next, define $$\begin{array}{cccc} T\colon &[1,\infty) &\to &[1,\infty) \\ &x&\mapsto&\sqrt{1+x} \end{array}$$ and note that $$|T(x)-T(y)| = |\sqrt{1+x}-\sqrt{1+y}| = \frac{|x-y|}{\sqrt{1+x}+\sqrt{1+y}} \leq \frac{|x-y|}{2\sqrt{2}},$$ since $\sqrt{1+x},\sqrt{1+y}\geq\sqrt{2}$ for $x,y\geq1$. Thus, by the same argument, this $T$ has a unique fixed point, given by $\displaystyle\lim_{n\to\infty} T^n(x)$ for any $x\in[1,\infty)$; again the fixed point is $\phi$.
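Both contraction arguments are easy to confirm numerically: iterating either map from any admissible starting point lands on the same fixed point, the golden ratio. A small sketch (helper name `iterate` is my own):

```python
# Iterate each contraction map from several admissible starting points;
# by the Banach fixed-point theorem every orbit converges to phi.

PHI = (1 + 5 ** 0.5) / 2  # the common fixed point of both maps

def iterate(T, x, n=100):
    """Apply the map T to x a total of n times."""
    for _ in range(n):
        x = T(x)
    return x

for x0 in (1.2, 2.0, 50.0):  # points of [1 + 1/N, N] for large enough N
    print(iterate(lambda x: 1 + 1 / x, x0) - PHI)       # tiny
for x0 in (1.0, 10.0, 1e6):  # points of [1, infinity)
    print(iterate(lambda x: (1 + x) ** 0.5, x0) - PHI)  # tiny
```

Note how the starting point barely matters: even $x_0 = 10^6$ collapses toward $\phi$ within a handful of square roots.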