Finding a polynomial to satisfy a matrix equation


Is there a canonical way of finding a polynomial $p$ such that $$ p\left(\begin{bmatrix} 1& 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}\right) = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \end{bmatrix}? $$ If the diagonal elements of the matrix on the left were all distinct, I could use Lagrange interpolation to find a polynomial mapping $d_i \mapsto d'_i$, but in this case such a polynomial would not be well-defined. For anyone seeking to generalize this: I need the entries on the right-hand side to be either of the two square roots of the corresponding entries on the left-hand side. The diagonal entries on the left side are guaranteed to be positive.
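To illustrate the distinct-eigenvalue case mentioned above, here is a small numpy sketch (the helper `lagrange_poly_matrix` is my own name, not a library function): given distinct diagonal entries, the Lagrange interpolating polynomial through the points $(d_i, d'_i)$, evaluated at the matrix, produces exactly the desired diagonal matrix, including sign choices for the square roots.

```python
import numpy as np

def lagrange_poly_matrix(nodes, values, A):
    """Evaluate at the square matrix A the Lagrange polynomial
    interpolating the points (nodes[i], values[i])."""
    n = len(nodes)
    I = np.eye(A.shape[0])
    result = np.zeros_like(A, dtype=float)
    for i in range(n):
        # i-th Lagrange basis polynomial evaluated at A:
        # L_i(A) = prod_{j != i} (A - x_j I) / (x_i - x_j)
        Li = I.copy()
        for j in range(n):
            if j != i:
                Li = Li @ (A - nodes[j] * I) / (nodes[i] - nodes[j])
        result += values[i] * Li
    return result

# Distinct positive diagonal entries; pick either square root of each.
D = np.diag([1.0, 4.0, 9.0, 16.0])
targets = [1.0, -2.0, 3.0, -4.0]   # a sign choice of +/- sqrt(d_i)
P = lagrange_poly_matrix([1.0, 4.0, 9.0, 16.0], targets, D)
print(np.round(P, 10))  # diag(1, -2, 3, -4)
```

This works because the Lagrange basis polynomials satisfy $L_i(d_k) = \delta_{ik}$, so $p(D)$ acts on each diagonal entry independently. With repeated entries, as in the identity matrix above, the interpolation problem has no solution when the targets disagree.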


No. As you point out, you can do this when the diagonal elements are distinct, but not otherwise. The problem is that for any polynomial $p$, the diagonal entries of $p(I)$ are all equal to $p(1)$; that is, $p(I) = p(1)I$ is always a scalar multiple of the identity, so it can never have both $1$ and $-1$ on its diagonal.
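A quick numerical check of this argument (the coefficients are arbitrary, chosen only for illustration): evaluating any polynomial at the identity matrix collapses to $p(1)$ times the identity.

```python
import numpy as np

# p(x) = 2x^3 - 3x^2 + 0.5x + 7, coefficients in descending order.
coeffs = [2.0, -3.0, 0.5, 7.0]
I = np.eye(4)

# p(I) = sum of c_k * I^k; every power of I is I itself,
# so p(I) = (sum of coefficients) * I = p(1) * I.
pI = sum(c * np.linalg.matrix_power(I, k)
         for k, c in enumerate(reversed(coeffs)))
print(pI)  # 6.5 * I, since p(1) = 2 - 3 + 0.5 + 7 = 6.5
```

No choice of coefficients changes this: the diagonal entries of the output are forced to agree, so the target matrix with mixed signs is unreachable.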