Prove that these sets of polynomials have real and distinct roots.


Can anyone tell me if the following set of polynomials has a special name? $$P_{0}(x)=1,\qquad P_{1}(x)=x,$$ $$P_{n}(x)=xP_{n-1}(x)-P_{n-2}(x).$$ The recursion gives $$P_{2}(x)=x^2-1;\quad P_{3}(x)=x^3-2x;\quad P_{4}(x)=x^4-3x^2+1;\quad P_{5}(x)=x^5-4x^3+3x,$$ so $P_{n}(x)$ has parity $(-1)^n$. I was trying to find out whether they are orthogonal, but could not find a suitable weight function. My main concern is to prove that $P_{n}(x)$ has $n$ distinct real roots, all greater than or equal to $-2$.


Your polynomials are indeed a special case of the classical orthogonal polynomials. According to Abramowitz/Stegun 22.7.6 you have $$P_n(x) = S_n(x)= U_n\left(\frac{x}{2}\right),$$ where $U_n(x)$ is the well-known Chebyshev polynomial of the second kind.
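This identification is easy to sanity-check numerically. The sketch below (function names are ad hoc) evaluates both three-term recursions, the $P_n$ recursion from the question and the standard Chebyshev recursion $U_0=1$, $U_1=2x$, $U_n=2xU_{n-1}-U_{n-2}$, and compares $P_n(x)$ with $U_n(x/2)$:

```python
def P(n, x):
    # P_0 = 1, P_1 = x, P_n = x*P_{n-1} - P_{n-2}
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for _ in range(n - 1):
        p0, p1 = p1, x * p1 - p0
    return p1

def U(n, x):
    # Chebyshev 2nd kind: U_0 = 1, U_1 = 2x, U_n = 2x*U_{n-1} - U_{n-2}
    u0, u1 = 1.0, 2.0 * x
    if n == 0:
        return u0
    for _ in range(n - 1):
        u0, u1 = u1, 2.0 * x * u1 - u0
    return u1

# P_n(x) agrees with U_n(x/2) at arbitrary sample points
for n in range(8):
    for x in (-1.7, -0.3, 0.0, 1.1, 1.9):
        assert abs(P(n, x) - U(n, x / 2.0)) < 1e-9
```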

The weight function for the interval $(-2,2)$ is $$w(x)=\left(1-\frac{x^2}{4}\right)^{1/2}$$

And of course this means that the roots are simple, real and located in the interval $(-2,2).$ For a proof see e.g. my answer to "Proof the Legendre polynomial $P_n$ has $n$ distinct real zeros".


The orthogonality of $P_n$ follows from the corresponding property of $U_n$ and $\sin$, see e.g. https://www.sciencedirect.com/science/article/pii/0377042793901485:

With $x=\cos\theta$ and $\sin \theta = (1-x^2)^{1/2}$ you have $$U_n(\cos \theta) = \frac{\sin\big((n{+}1)\theta\big)}{\sin\theta}$$ so $$(1-x^2)^{1/2}U_n(x) = \sin\big((n{+}1)\theta\big)$$
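The orthogonality relation can also be checked numerically. The sketch below (names ad hoc) approximates $\int_{-2}^{2} P_m(x)P_n(x)\sqrt{1-x^2/4}\,dx$ with a midpoint rule; off-diagonal entries should vanish, and by the substitution $x=2\cos\theta$ the diagonal entries equal $\pi$:

```python
import math

def P(n, x):
    # three-term recursion from the question
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for _ in range(n - 1):
        p0, p1 = p1, x * p1 - p0
    return p1

def inner(m, n, steps=100000):
    # midpoint rule for the weighted inner product on (-2, 2)
    h = 4.0 / steps
    s = 0.0
    for i in range(steps):
        x = -2.0 + (i + 0.5) * h
        s += P(m, x) * P(n, x) * math.sqrt(1.0 - x * x / 4.0) * h
    return s
```

For example, `inner(0, 2)` is close to 0 while `inner(2, 2)` is close to $\pi$ (up to quadrature error).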

(I have not seen a fully formulated proof yet; perhaps a direct proof from the recursion can be modelled after https://planetmath.org/orthogonalityofchebyshevpolynomialsfromrecursion.)


In what follows an explicit expression for the roots of the polynomials $P_n(x)$ will be derived. The statement "the roots are all real, distinct and less than 2 in absolute value" follows immediately.


Consider a family of $n\times n$ tridiagonal matrices (with zero main diagonal): $$\begin{align} A^{(n)}_{ij}=&\delta_{i-j,1}+\delta_{j-i,1}, \end{align}\tag{1}$$ given below for $n=5$ as an example: $$ A^{(5)}=\begin{pmatrix} 0&1&0&0&0\\ 1&0&1&0&0\\ 0&1&0&1&0\\ 0&0&1&0&1\\ 0&0&0&1&0 \end{pmatrix}. $$

Lemma 1. The eigenvalues of the matrix (1) are: $$ \begin{align} \lambda_m=2\cos\frac{\pi m}{n+1},& \text{with associated eigenvectors } u_{mk}=\sin\frac{\pi m k}{n+1}, \end{align}\tag{2} $$ where $m$ and $k$ run from 1 to $n$.

Though it would suffice for the proof to let the matrix $A$ act on the given vectors, we present below an extended "constructive" version.

Assume the elements of an eigenvector $u$ have the form: $$ u_k=e^{\alpha k}+ae^{-\alpha k},\tag{3} $$ with some parameters $a$ and $\alpha$, which are to be found.

Obviously, for all $k=2,\dots,n-1$: $$ (Au)_k=u_{k-1}+u_{k+1}=\left(e^{\alpha k}+ae^{-\alpha k}\right)\left(e^\alpha+e^{-\alpha}\right) =\left(e^\alpha+e^{-\alpha}\right)u_k.\tag{4}$$

Thus it remains only to find such $a$ and $\alpha$ that the equation (4) is satisfied for $k=1$ and $k=n$ as well.

For $k=1$: $$ e^{2\alpha}+ae^{-2\alpha}=\left(e^\alpha+e^{-\alpha}\right)\left(e^\alpha+a e^{-\alpha}\right) \Leftrightarrow 1+a=0.\tag{5} $$

For $k=n$: $$ e^{\alpha (n-1)}+ae^{-\alpha(n-1)}=\left(e^\alpha+e^{-\alpha}\right)\left(e^{\alpha n}+a e^{-\alpha n}\right) \Leftrightarrow e^{\alpha (n+1)}+ae^{-\alpha(n+1)}=0.\tag{6} $$

It follows that $a=-1$ and $\alpha=\frac{\pi m}{n+1}i$, where $m$ is an integer. Plugging these values into (3) and (4) one obtains (2).

As all $n$ eigenvalues are distinct, Lemma 1 is proved.
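Lemma 1 lends itself to a direct numerical check: for $u_k=\sin\frac{\pi m k}{n+1}$ one verifies $(Au)_k=u_{k-1}+u_{k+1}=\lambda_m u_k$ componentwise, with the boundary convention $u_0=u_{n+1}=0$. A sketch (names ad hoc):

```python
import math

def check_lemma1(n):
    # A^{(n)} has 1's on the sub- and superdiagonal; verify A u = lambda u
    for m in range(1, n + 1):
        lam = 2.0 * math.cos(math.pi * m / (n + 1))
        u = [math.sin(math.pi * m * k / (n + 1)) for k in range(1, n + 1)]
        for k in range(n):
            # (A u)_k = u_{k-1} + u_{k+1}, with zero boundary terms
            left = (u[k - 1] if k > 0 else 0.0) + (u[k + 1] if k < n - 1 else 0.0)
            assert abs(left - lam * u[k]) < 1e-9
    return True

check_lemma1(5)
```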

Lemma 2. The characteristic polynomial of the negated matrix (1), $$ Q_n(x)\equiv\left|A^{(n)}+x I^{(n)}\right|, $$ where $I^{(n)}$ is the $n\times n$ identity matrix, is the polynomial in question: $$Q_n(x)=P_n(x).\tag{7}$$

For $n=1$ and $n=2$ the equality (7) is obvious. Assume that (7) holds for all $n<N$; then it holds for $n=N$ as well.

Indeed, applying the Laplace expansion along the last row of $A^{(N)}+xI^{(N)}$ (with $N>2$) one readily obtains: $$ Q_N(x)=x Q_{N-1}(x)-Q_{N-2}(x)\stackrel{I.H.}{=}x P_{N-1}(x)-P_{N-2}(x)=P_N(x).\tag{8} $$
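Lemma 2 can also be sanity-checked numerically. In the sketch below (names ad hoc, `numpy` assumed), `np.poly(-A)` returns the coefficients of $\det(xI-(-A))=\det(xI+A^{(n)})=Q_n(x)$, which are compared with the coefficients produced by the recursion:

```python
import numpy as np

def P_coeffs(n):
    # coefficients (highest degree first) of P_n from the recursion
    p0, p1 = np.array([1.0]), np.array([1.0, 0.0])  # P_0 = 1, P_1 = x
    if n == 0:
        return p0
    for _ in range(n - 1):
        # P_n = x * P_{n-1} - P_{n-2}
        p0, p1 = p1, np.polysub(np.polymul([1.0, 0.0], p1), p0)
    return p1

def Q_coeffs(n):
    # Q_n(x) = det(x I + A^{(n)}): characteristic polynomial of -A^{(n)}
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.poly(-A)

for n in range(1, 9):
    assert np.allclose(P_coeffs(n), Q_coeffs(n))
```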

Thus, by induction Lemma 2 is proved.

Now, the roots of the characteristic polynomial $Q_n(x)=\left|A^{(n)}+xI^{(n)}\right|$ are exactly the negated eigenvalues of $A^{(n)}$; and since the spectrum (2) is symmetric about zero ($\lambda_{n+1-m}=-\lambda_m$), negation leaves the set of roots unchanged. Hence:

Lemma 3. The roots of $P_n(\lambda)$ are: $$ \lambda^{(n)}_m=2\cos\frac{\pi m}{n+1}, \quad m=1,\dots,n, $$ a simple corollary of Lemmas 1 and 2.
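As a final sanity check of Lemma 3 (a sketch, names ad hoc): the values $2\cos\frac{\pi m}{n+1}$, $m=1,\dots,n$, are $n$ distinct numbers in $(-2,2)$, and evaluating $P_n$ at each of them gives zero up to rounding error:

```python
import math

def P(n, x):
    # three-term recursion from the question
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for _ in range(n - 1):
        p0, p1 = p1, x * p1 - p0
    return p1

for n in range(1, 15):
    roots = [2.0 * math.cos(math.pi * m / (n + 1)) for m in range(1, n + 1)]
    # n distinct values, all strictly inside (-2, 2)
    assert len(set(round(r, 12) for r in roots)) == n
    assert all(-2.0 < r < 2.0 for r in roots)
    # each is a root of P_n
    for r in roots:
        assert abs(P(n, r)) < 1e-8
```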