Proving a Polynomial Identity


Prove that

$$\sum_{i=1}^{n} \dfrac{{r_{i}}^k P(x)}{P'(r_{i})(x-r_{i})} = x^k$$

where $P(x)$ is a polynomial of degree $n$ with distinct roots $\{ r_{i} \}_{i=1}^{n}$, and $k$ is an integer with $0 \leq k \leq n-1$.

I solved it by considering the polynomial $Q(x) = \displaystyle \sum_{i=1}^{n} \dfrac{{r_{i}}^k P(x)}{P'(r_{i})(x-r_{i})} - x^k$ and noting that it has degree at most $n-1$ but $n$ roots, namely $r_{1}, \ r_{2}, \ \ldots \ , \ r_{n}$.

Thus $Q(x)$ is identically $0$.
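As a numerical sanity check (a SymPy sketch; the cubic and its roots are chosen arbitrarily, not taken from the problem), one can verify that $Q(x)$ simplifies to zero for every admissible $k$:

```python
import sympy as sp

x = sp.symbols('x')
roots = [1, 2, 5]            # distinct roots, arbitrary choice
P = sp.S.One
for r in roots:
    P *= (x - r)             # P(x) = (x-1)(x-2)(x-5)
Pp = sp.diff(P, x)           # P'(x)

n = len(roots)
for k in range(n):           # 0 <= k <= n-1
    Q = sum(r**k * P / (Pp.subs(x, r) * (x - r)) for r in roots) - x**k
    assert sp.simplify(Q) == 0
print("identity holds for k = 0, ..., n-1")
```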

I'm seeking other approaches to the problem.

Any help will be appreciated.
Thanks.


There are two answers below.

Best answer:

Naturally, the left-hand side is the Lagrange interpolation polynomial for the points $r_1,\dots,r_n$ and the values $r_1^k,\dots,r_n^k$, hence the claim. But this is kind of cheating, because it uses the same idea you want to avoid: if two polynomials of degree at most $n-1$ agree at $n$ points, they must be equal. Thus, I give two alternative approaches.
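For a concrete illustration (a SymPy sketch; the nodes are chosen arbitrarily), interpolating the points $(r_i, r_i^k)$ does indeed return $x^k$:

```python
import sympy as sp

x = sp.symbols('x')
roots = [2, 3, 7, 11]        # distinct nodes r_i, arbitrary choice
k = 2                        # any 0 <= k <= n-1

# Lagrange interpolation through the points (r_i, r_i^k)
poly = sp.interpolate([(r, r**k) for r in roots], x)
print(sp.expand(poly))       # → x**2
```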


The first uses the Chinese remainder theorem. Assume wlog that the leading coefficient of $P$ equals $1$. Consider the ring $\mathbb{K}[x]$ (a principal ideal domain) and the elements $u_i(x) = x-r_i\in \mathbb K[x]$, $i=1,\dots,n$. These are pairwise coprime, so for any $y_i\in \mathbb K[x]/(u_i \mathbb K[x])$, $i=1,\dots,n$, there is a unique element $q\in \mathbb K[x]/(u \mathbb K[x])$, where $u=u_1\cdots u_n = P$, such that $q \equiv y_i \pmod {u_i}$ for all $i=1,\dots, n$. But the elements of $\mathbb K[x]/(u \mathbb K[x])$ may be identified with polynomials of degree at most $n-1$, and reduction modulo the linear factor $u_i$ is just evaluation at $r_i$. So there is a unique polynomial $Q$ of degree at most $n-1$ such that $Q(r_i) = y_i(r_i)$ for all $i = 1,\dots,n$. Taking $y_i = r_i^k + u_i \mathbb{K}[x]$ and noting that $Q(x) = x^k$ satisfies these equations, we arrive at the required statement.
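A sketch of this CRT construction in SymPy (roots and variable names are assumed for illustration): since each $u_i = x - r_i$ is linear, the inverse of $M_i = P/u_i$ modulo $u_i$ is the constant $1/M_i(r_i) = 1/P'(r_i)$, so the CRT solution is exactly the left-hand side of the identity:

```python
import sympy as sp

x = sp.symbols('x')
roots = [1, 4, 6]            # distinct roots, arbitrary choice
P = sp.S.One
for r in roots:
    P *= (x - r)
Pp = sp.diff(P, x)

k = 1
# CRT: Q = sum_i y_i * M_i * (M_i^{-1} mod u_i), with y_i = r_i^k,
# M_i = P/(x - r_i), and M_i^{-1} mod (x - r_i) = 1/M_i(r_i) = 1/P'(r_i).
Q = sum(r**k * (P / (x - r)) / Pp.subs(x, r) for r in roots)
print(sp.simplify(Q))        # reduces to x**k, here x
```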


The second approach is to solve the system of equations $$ \sum_{i=1}^n r_i^k z_i(x) = x^k,\quad k=0,\dots,n-1, $$ in $\mathbb K(x)$. Cramer's rule gives $$ z_i(x) = \frac{\operatorname{det} A_i}{\operatorname{det} A}, $$ where $$ A = \begin{pmatrix} 1 & 1 & \cdots & 1 \\ r_1 & r_2 & \cdots & r_n \\ r^2_1 & r^2_2 & \cdots & r^2_n \\ \vdots & \vdots & \ddots &\vdots \\ r_1^{n-1} & r_2^{n-1} & \cdots & r_n^{n-1} \end{pmatrix} $$ and $A_i$ is $A$ with the $i$th column replaced by $(1,x,\dots,x^{n-1})^\top$. Both $A$ and $A_i$ are Vandermonde matrices, so we arrive, after making all cancellations, at $$ z_i(x) = \frac{\prod_{j\neq i} (x-r_j)}{\prod_{j\neq i} (r_i-r_j)} = \frac{P(x)}{P'(r_i)(x-r_i)}, $$ which yields the desired polynomial identities.
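The Vandermonde cancellation can be checked symbolically (a SymPy sketch with three symbolic roots, names assumed):

```python
import sympy as sp

x, r1, r2, r3 = sp.symbols('x r1 r2 r3')
roots = [r1, r2, r3]
n = len(roots)

# Vandermonde matrix: row j holds the j-th powers of the roots
A = sp.Matrix([[r**j for r in roots] for j in range(n)])
b = sp.Matrix([x**j for j in range(n)])   # right-hand side (1, x, ..., x^{n-1})

for i in range(n):
    Ai = A.copy()
    Ai[:, i] = b                          # Cramer's rule: replace column i
    z_i = sp.cancel(Ai.det() / A.det())
    # Lagrange basis polynomial prod_{j != i} (x - r_j)/(r_i - r_j)
    lagrange = sp.S.One
    for j in range(n):
        if j != i:
            lagrange *= (x - roots[j]) / (roots[i] - roots[j])
    assert sp.simplify(z_i - lagrange) == 0
```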

Another answer:

OK, as zhoraster wrote, the first method is simply recognizing the Lagrange interpolation polynomial $$ \prod_{j\ne i} \frac{x-r_j}{r_i-r_j} = \frac{P(x)}{P'(r_i)\cdot(x-r_i)}. $$ So, we are interpolating the points $(r_i,r_i^k)$ by a polynomial and wondering why we get $x^k$ as the result. :-)


A different approach, using complex analysis (so it works for complex polynomials only): let $x$ be a fixed complex number other than $r_1,\ldots,r_n$, and consider the residues of the function $$ f(z) = \frac{z^k P(x)}{P(z)\cdot (x-z)}. $$ The degree of this rational function in $z$ is $k-(n+1)\le-2$, so the sum of its residues is $0$.

The function $f(z)$ has simple poles at $r_1,\ldots,r_n$ and $x$, its residues are $\dfrac{r_j^kP(x)}{P'(r_j)\cdot (x-r_j)}$ at $z=r_j$, and $-x^k$ at $z=x$. So, $$ \sum \mathrm{Res}\,\, f = \sum_{j=1}^n \frac{r_j^kP(x)}{P'(r_j)\cdot (x-r_j)} + (-x^k) = 0. $$
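This residue computation can be reproduced with SymPy's `residue` (a sketch; the roots are chosen arbitrarily, $x$ stays symbolic):

```python
import sympy as sp

x, z = sp.symbols('x z')
roots = [1, 2, 4]            # distinct roots, arbitrary choice
k = 1
P = sp.S.One
for r in roots:
    P *= (z - r)             # P(z)
Px = P.subs(z, x)            # P(x)

f = z**k * Px / (P * (x - z))

# Simple poles at each r_j and at z = x; the residues must sum to zero.
total = sum(sp.residue(f, z, r) for r in roots) + sp.residue(f, z, x)
print(sp.simplify(total))    # → 0
```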