The largest root of a recursively defined polynomial


Suppose that for all $x \in \mathbb{R}$, $f_1(x)=x^2$ and for all $k \in \mathbb{N}$, $$ f_{k+1}(x) = f_k(x) - f_k'(x) x (1-x). $$ Let $\underline{x}_k$ denote the largest root of $f_k(x)=0$.

I want to prove the following conjectures:

  1. $0=\underline{x}_1 < \underline{x}_2 < \underline{x}_3 < \dots < 1$.
  2. $\lim_{n \to \infty} \underline{x}_n = 1$.
  3. $f_k'(\underline{x}_k) \geq 0$ for all $k$.

These conjectures are the missing piece of a larger proof, and they seem to hold for every $f_k$ that I can compute by hand or numerically. The first few polynomials:

  1. $f_1(x)=x^2$, so that $\underline{x}_1=0$ and $f_1'(\underline{x}_1) = 0$
  2. $f_2(x)=x^2 (2x-1)$, so that $\underline{x}_2=\frac{1}{2} \in (\underline{x}_1,1)$ and $f_2'(1/2)=1/2>0$
  3. $f_3(x)=x^2 (6x^2-6x+1)$, so that $\underline{x}_3 = \frac{1}{2} + \frac{1}{2 \sqrt{3}} \approx 0.7887 \in (\underline{x}_2,1)$ and $f_3'(\underline{x}_3) \approx 2.1547>0$
  4. $f_4(x)=x^2 (2x-1) (12 x^2-12 x+1)$, so that $\underline{x}_4=\frac{1}{2}+\frac{1}{\sqrt{6}} \approx 0.9082 \in (\underline{x}_3,1)$ and $f_4'(\underline{x}_4) \approx 6.5993 > 0$
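
These conjectures are easy to probe numerically. Below is a minimal pure-Python sketch (all helper names are my own) that builds the integer coefficient lists of $f_k$ from the recursion and brackets the largest root by scanning down from $x=1$ and bisecting:

```python
def deriv(p):
    # derivative of a polynomial given by ascending coefficients
    return [i * c for i, c in enumerate(p)][1:]

def polymul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polysub(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def step(p):
    # f_{k+1} = f_k - x(1-x) f_k';  x(1-x) has coefficients [0, 1, -1]
    return polysub(p, polymul([0, 1, -1], deriv(p)))

def evalp(p, x):
    return sum(c * x ** i for i, c in enumerate(p))

def largest_root(p, grid=100000):
    # f_k(1) = 1 > 0, so walk down from 1 to the first non-positive value,
    # then bisect the bracketing interval
    hi, lo = 1.0, 0.0
    for i in range(grid - 1, -1, -1):
        if evalp(p, i / grid) <= 0:
            lo = i / grid
            break
        hi = i / grid
    for _ in range(80):
        mid = (lo + hi) / 2
        if evalp(p, mid) <= 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

f = [0, 0, 1]                      # f_1(x) = x^2, ascending coefficients
roots = []
for k in range(1, 8):
    roots.append(largest_root(f))
    f = step(f)

print(roots)                       # strictly increasing towards 1
```

Running more iterations shows the largest roots creeping towards $1$, consistent with conjectures 1 and 2.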

The polynomials $f_k$ have many properties that should help. For example, it is easy to show by induction that $f_k(1)=1$ for all $k$, and that $f_k(x) > 1$ for all $x>1$ and all $k$. Therefore all real roots lie strictly below $1$.



Accepted answer

We can start by proving by induction that for all $k\ge1$ $$ f_k(x)=k!\cdot(x-x_0)\cdots(x-x_k) $$ where $$ 0=x_0=x_1<x_2<\cdots<x_k<1. $$ The base case is immediate: $f_1(x)=1!\,(x-0)(x-0)=x^2$. Let's assume this holds for $f_k$ and prove it for $f_{k+1}$.

First, we may note that the derivative factors as $$ f_k'(x)=(k+1)!\cdot(x-y_1)\cdots(x-y_k) $$ where $$ 0=x_0=y_1=x_1<y_2<\cdots<y_k<x_k<1. $$ Here $y_1=0$ comes from the double root at zero, and $x_{j-1}<y_j<x_j$ for $j>1$ follows from Rolle's theorem applied between the consecutive roots $x_{j-1}$ and $x_j$ of $f_k$, or alternatively from observing that $f_k'(x_{j-1})$ and $f_k'(x_j)$ have different signs.

The rest simply boils down to observing that the sign of $x(1-x)f_k'(x)$ at the points $x_j$ alternates between positive and negative. Since $f_k(x)$ is zero at all $x_j$, $f_{k+1}(x_j)=x_j(x_j-1)f_k'(x_j)$ also has alternating sign along the points $x_2,\ldots,x_k,1$: i.e., $f_{k+1}(1)>0$, $f_{k+1}(x_k)<0$, etc. Thus, $f_{k+1}(x)$ must have zeroes between the points $0=x_1,x_2,\ldots,x_k,1$: say at $z_j$, where $$ 0=z_0=x_0=z_1=x_1<z_2<x_2<z_3<\cdots<x_k<z_{k+1}<1, $$ where I have included the double root at zero, which is easy to check. There can be no more roots than this due to the degree of $f_{k+1}$.
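
The alternating-sign argument is easy to sanity-check numerically: locate the simple roots of $f_k$ in $(0,1)$ as sign changes on a fine grid, then verify that $f_{k+1}$ alternates sign along $x_2,\dots,x_k,1$. A rough pure-Python sketch (helper names are mine; the grid is fine enough for these low degrees):

```python
def deriv(p):
    return [i * c for i, c in enumerate(p)][1:]

def polymul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polysub(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def step(p):
    # f_{k+1} = f_k - x(1-x) f_k'
    return polysub(p, polymul([0, 1, -1], deriv(p)))

def evalp(p, x):
    return sum(c * x ** i for i, c in enumerate(p))

def simple_roots_01(p, grid=40000):
    # simple roots in (0,1) show up as sign changes of p on a fine grid
    roots, prev = [], evalp(p, 1 / grid)
    for i in range(2, grid):
        v = evalp(p, i / grid)
        if v == 0.0:
            roots.append(i / grid)             # grid point hits the root exactly
        elif prev * v < 0:
            roots.append((i - 0.5) / grid)     # midpoint of the bracketing cell
        prev = v
    return roots

f = [0, 0, 1]                                  # f_1(x) = x^2
for k in range(1, 7):
    g = step(f)                                # f_{k+1}
    xs = simple_roots_01(f)                    # x_2 < ... < x_k
    assert len(xs) == k - 1                    # f_k has k-1 simple roots in (0,1)
    vals = [evalp(g, x) for x in xs] + [evalp(g, 1.0)]
    signs = [1 if v > 0 else -1 for v in vals]
    # f_{k+1}(1) > 0, f_{k+1}(x_k) < 0, ... : the signs alternate
    assert all(signs[i] != signs[i + 1] for i in range(len(signs) - 1))
    f = g

print("f_{k+1} alternates sign along x_2, ..., x_k, 1 for k = 1..6")
```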

There's a little extra checking for $y_2$ and $z_2$ because of the double root at zero which makes the sign argument fail there. Alternatively, you could redo the entire argument for $0=x_0<x_1<\cdots$ and just take the limit as $x_1\rightarrow0$.

This proves points (1) and (3). What remains is point (2): that the top root converges to 1.

You have already observed that $f_k(1)=1$ for all $k$. This makes $$ (1-x_2)\cdots(1-x_k)=\frac{f_k(1)}{k!}=\frac{1}{k!}, $$ where the two roots at zero contribute factors of $1$. Since $0<x_j<x_k$ for $j=2$ to $k-1$, this makes $(1-x_k)^{k-1}<1/k!$ for $k\ge3$ (with equality at $k=2$), so $1-x_k\le(k!)^{-1/(k-1)}$, which drops towards zero as $k$ increases.
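
For small $k$ the bound is easy to confirm numerically. A pure-Python sketch (helper names are mine) that rebuilds the coefficients of $f_k$ and bisects for the top root:

```python
from math import factorial

def deriv(p):
    return [i * c for i, c in enumerate(p)][1:]

def polymul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polysub(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def step(p):
    # f_{k+1} = f_k - x(1-x) f_k'
    return polysub(p, polymul([0, 1, -1], deriv(p)))

def evalp(p, x):
    return sum(c * x ** i for i, c in enumerate(p))

def largest_root(p, grid=100000):
    hi, lo = 1.0, 0.0
    for i in range(grid - 1, -1, -1):       # f_k(1) = 1 > 0: walk down to a sign change
        if evalp(p, i / grid) <= 0:
            lo = i / grid
            break
        hi = i / grid
    for _ in range(80):                     # bisect the bracketing interval
        mid = (lo + hi) / 2
        if evalp(p, mid) <= 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

f = step(step([0, 0, 1]))                   # f_3 (the bound is strict from k = 3 on)
for k in range(3, 9):
    xk = largest_root(f)
    assert 1 - xk < factorial(k) ** (-1 / (k - 1))
    f = step(f)

print("1 - x_k < (k!)^(-1/(k-1)) confirmed for k = 3..8")
```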


Previous proof of (2) in case of interest:

Recall that, in addition to the two roots at zero, $f_{k+1}$ has roots $z_2,\ldots,z_k,z_{k+1}$ with $z_j<x_j$ for $j=2$ to $k$, so $$ \frac{1}{(k+1)!}=(1-z_2)\cdots(1-z_{k+1}) >(1-x_2)\cdots(1-x_k)(1-z_{k+1})=\frac{1-z_{k+1}}{k!}, $$ which makes $1-z_{k+1}<1/(k+1)$. So in general, the top root satisfies $\underline{x}_n\ge1-1/n$, with strict inequality for $n\ge3$.
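
The bound $\underline{x}_n>1-1/n$ for $n\ge3$ can likewise be confirmed numerically for small $n$; a sketch with hand-rolled coefficient helpers (the names are mine):

```python
def deriv(p):
    return [i * c for i, c in enumerate(p)][1:]

def polymul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polysub(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def step(p):
    # f_{k+1} = f_k - x(1-x) f_k'
    return polysub(p, polymul([0, 1, -1], deriv(p)))

def evalp(p, x):
    return sum(c * x ** i for i, c in enumerate(p))

def largest_root(p, grid=100000):
    hi, lo = 1.0, 0.0
    for i in range(grid - 1, -1, -1):       # f_n(1) = 1 > 0: walk down to a sign change
        if evalp(p, i / grid) <= 0:
            lo = i / grid
            break
        hi = i / grid
    for _ in range(80):                     # bisect the bracketing interval
        mid = (lo + hi) / 2
        if evalp(p, mid) <= 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

f = step(step([0, 0, 1]))                   # f_3
for n in range(3, 10):
    assert largest_root(f) > 1 - 1 / n
    f = step(f)

print("top root of f_n exceeds 1 - 1/n for n = 3..9")
```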


Origin of proof

In case anyone wonders which hat I pulled this rabbit out of: I was already familiar with this method for proving that polynomials defined recursively like this have distinct roots (in a case without the double root at zero). I may have slipped up by assuming, throughout the main part of the proof, that this was what point (1) asked for.

Thinking about it, I suppose that when the only interest is the top root, the argument could be simplified to address just that root rather than all the lower roots as well.

Second answer

From $f_{k+1}(x) = f_k(x) - f_k'(x) x (1-x) $, let $f_k(x) =x^2 g_k(x) $, so $f_k'(x) =x^2 g_k'(x)+2xg_k(x) =x(x g_k'(x)+2g_k(x)) $.

Then

$\begin{aligned} x^2g_{k+1}(x) &= x^2g_k(x) - x(1-x)\,x\bigl(x g_k'(x)+2g_k(x)\bigr)\\ &= x^2\bigl(g_k(x) - (1-x)(x g_k'(x)+2g_k(x))\bigr)\\ \text{so}\quad g_{k+1}(x) &= g_k(x) - (1-x)\bigl(x g_k'(x)+2g_k(x)\bigr)\\ &= g_k(x) - (1-x)x g_k'(x)-2(1-x)g_k(x)\\ &= \bigl(1-2(1-x)\bigr)g_k(x) - (1-x)x g_k'(x)\\ &= (2x-1)g_k(x) - (1-x)x g_k'(x)\\ &= (2x-1)g_k(x) + x(x-1) g_k'(x) \end{aligned} $

As a check, since $g_1(x) = 1$, $g_2(x) =2x-1 $ and

$\begin{aligned} g_3(x) &=(2x-1)(2x-1)+x(x-1)\cdot 2\\ &=4x^2-4x+1+2x(x-1)\\ &=6x^2-6x+1 \end{aligned} $

Note that $g_{k+1}(x) = (2x-1)g_k(x) + x(x-1) g_k'(x) =((x^2-x)g_k(x))' $
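
The product-rule identity can be double-checked on exact integer coefficients; a short sketch (helper names are mine) comparing the direct recursion against $\bigl((x^2-x)g_k\bigr)'$ for the first few $k$:

```python
def deriv(p):
    # derivative of a polynomial given by ascending coefficients
    return [i * c for i, c in enumerate(p)][1:]

def polymul(p, q):
    r = [0] * max(len(p) + len(q) - 1, 0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polyadd(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

g = [1]                                        # g_1(x) = 1
for k in range(1, 10):
    # direct recursion: g_{k+1} = (2x-1) g_k + x(x-1) g_k'
    direct = polyadd(polymul([-1, 2], g), polymul([0, -1, 1], deriv(g)))
    # product-rule form: g_{k+1} = ((x^2 - x) g_k)'
    via_product_rule = deriv(polymul([0, -1, 1], g))
    assert direct == via_product_rule
    g = direct

print("g_{k+1} = ((x^2 - x) g_k)' verified for k = 1..9")
```

Since $g_k(1)=f_k(1)=1$, the coefficient sums also stay equal to $1$, which gives a cheap extra consistency check.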

so that

$\begin{aligned} g_{k+2}(x) &=\bigl((x^2-x)g_{k+1}(x)\bigr)'\\ &=\Bigl((x^2-x)\bigl((2x-1)g_k(x) + x(x-1) g_k'(x)\bigr)\Bigr)'\\ &=\bigl((x^2-x)(2x-1)g_k(x) + x^2(x-1)^2 g_k'(x)\bigr)'\\ &=\bigl((2x^3-3x^2+x)g_k(x) + (x^4-2x^3+x^2) g_k'(x)\bigr)'\\ &= (6 x^2-6 x+1) g_k(x)+x \bigl(x (x-1)^2 g_k''(x)+(6 x^2-9 x+3) g_k'(x)\bigr) \end{aligned} $
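
The two-step formula can be cross-checked on exact integer coefficients by comparing two applications of the one-step recursion with the expanded closed form $g_{k+2}=(6x^2-6x+1)g_k+(6x^3-9x^2+3x)g_k'+(x^4-2x^3+x^2)g_k''$; a sketch (helper names are mine):

```python
def deriv(p):
    return [i * c for i, c in enumerate(p)][1:]

def polymul(p, q):
    r = [0] * max(len(p) + len(q) - 1, 0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polyadd(*ps):
    n = max(len(p) for p in ps)
    return [sum(p[i] if i < len(p) else 0 for p in ps) for i in range(n)]

def step_g(g):
    # one step: g_{k+1} = (2x-1) g_k + x(x-1) g_k'
    return polyadd(polymul([-1, 2], g), polymul([0, -1, 1], deriv(g)))

def trim(p):
    # drop trailing zero coefficients so lists of different lengths compare fairly
    while p and p[-1] == 0:
        p = p[:-1]
    return p

g = [1]                                        # g_1(x) = 1
for k in range(1, 8):
    two_steps = step_g(step_g(g))
    # expanded closed form:
    # g_{k+2} = (6x^2-6x+1) g_k + (6x^3-9x^2+3x) g_k' + (x^4-2x^3+x^2) g_k''
    closed = polyadd(polymul([1, -6, 6], g),
                     polymul([0, 3, -9, 6], deriv(g)),
                     polymul([0, 0, 1, -2, 1], deriv(deriv(g))))
    assert trim(two_steps) == trim(closed)
    g = step_g(g)

print("two-step closed form for g_{k+2} verified for k = 1..7")
```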

Let $G(x, y) =\sum_{k=1}^{\infty} y^kg_k(x) $.

Then $G_x(x, y) =\sum_{k=1}^{\infty} y^kg_k'(x) $ and

$\begin{aligned} \frac1{y}G(x, y) &=\sum_{k=1}^{\infty} y^{k-1}g_k(x)\\ &=\sum_{k=0}^{\infty} y^{k}g_{k+1}(x)\\ &=g_1(x)+\sum_{k=1}^{\infty} y^{k}g_{k+1}(x)\\ &=1+\sum_{k=1}^{\infty} y^{k}\bigl((2x-1)g_k(x) + x(x-1) g_k'(x)\bigr)\\ &=1+(2x-1)\sum_{k=1}^{\infty} y^{k}g_k(x) + x(x-1)\sum_{k=1}^{\infty}y^k g_k'(x)\\ &=1+(2x-1)G(x, y) + x(x-1)G_x(x, y), \end{aligned}$

or

$$ x(x-1)G_x(x, y) =G(x, y)\left(\frac1{y}-(2x-1)\right)-1. $$

Not sure where to go from here, so I'll leave it at this.