How to find all matrices simultaneously fulfilling two different polynomial equations?


Say we have two different polynomial equations for a matrix $\bf P$:

$$\cases{\displaystyle \sum_{k=0}^{N_1} c_{k1} {\bf P}^k = {\bf 0}\\\displaystyle\sum_{k=0}^{N_2} c_{k2} {\bf P}^k = {\bf 0}}$$

How can we characterize / describe all possible solutions $\bf P$ ?

Own work

Inspired by the previous question about a single polynomial equation, let us denote by $\mathcal S_1$ and $\mathcal S_2$ the sets of roots of the two polynomials, i.e. the possible eigenvalues for matrices solving each equation on its own:

$$\mathcal S_1 = \{\lambda_{11},\cdots, \lambda_{N_11}\}$$

$$\mathcal S_2 = \{\lambda_{12},\cdots, \lambda_{N_22}\}$$

So that, for every $\lambda_1 \in \mathcal S_1$ and every $\lambda_2 \in \mathcal S_2$:

$$\cases{\displaystyle \sum_{k=0}^{N_1} c_{k1} {\lambda_1}^k = 0\\\displaystyle\sum_{k=0}^{N_2} c_{k2} {\lambda_2}^k = 0}$$

Now let us build a new set $\mathcal S = \mathcal S_1\cap \mathcal S_2 $

Then we draw eigenvalues from $\mathcal S$ so that every element is present at least once.

When we are done drawing, we do the same as in the previous question: put the drawn $\lambda$s into a diagonal matrix $\bf D$ and set, for some invertible $\bf S$,

$${\bf P = SDS}^{-1}$$

Will this suffice to build all possible matrices $\bf P$ which solve both of the above equations?
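As a small numerical sanity check of this construction, here is a Python sketch; the two polynomials, the drawn eigenvalues, and the similarity matrix below are made-up examples, not taken from the question:

```python
import numpy as np

# Hypothetical example polynomials (assumed for illustration):
# p1(x) = (x - 1)(x - 2)(x - 4) = x^3 - 7x^2 + 14x - 8
# p2(x) = (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6
c1 = [-8.0, 14.0, -7.0, 1.0]   # coefficients c_0, ..., c_3 of p1
c2 = [-6.0, 11.0, -6.0, 1.0]   # coefficients c_0, ..., c_3 of p2

def eval_poly_at_matrix(coeffs, P):
    """Evaluate sum_k coeffs[k] * P^k using Horner's scheme."""
    n = P.shape[0]
    res = np.zeros((n, n))
    for c in reversed(coeffs):
        res = res @ P + c * np.eye(n)
    return res

# Common roots: S = S1 ∩ S2 = {1, 2}; draw each at least once.
D = np.diag([1.0, 2.0, 1.0])
S = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])          # any invertible matrix
P = S @ D @ np.linalg.inv(S)

print(np.allclose(eval_poly_at_matrix(c1, P), 0))   # True
print(np.allclose(eval_poly_at_matrix(c2, P), 0))   # True
```

Here both example polynomials share the roots $1$ and $2$, so any $\bf P$ built this way is annihilated by both.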

Best Answer

Satisfying multiple polynomials is the same as satisfying a single polynomial (the GCD of the original polynomials) - even if you had an infinite set of polynomials to satisfy!

In particular, suppose we're working over a field $F$; if $P$ is an $n\times n$ matrix, we can always take any polynomial $a_0+a_1x+\ldots+a_kx^k$ in $F[x]$ and evaluate it at $P$, simply by computing $a_0I+a_1P+\ldots+a_kP^k$. The important fact is that this defines a map $\operatorname{eval}_P:F[x] \rightarrow \operatorname{Mat}_{n\times n}(F)$, which is a ring homomorphism, since it respects addition and multiplication. Note that a matrix $P$ "satisfies" a polynomial $f$ if and only if $\operatorname{eval}_P(f) = 0$.

This lets us tap into ring-theoretic reasoning: we are really asking that the kernel of $\operatorname{eval}_P$ contain two different polynomials. However, the kernel is an ideal, and we know that $F[x]$ is a principal ideal domain: every ideal is generated by a single polynomial, which can be computed as the GCD of the original polynomials. Thus "satisfying a set of polynomials" (even an infinite set) always reduces to "satisfying some single polynomial". One word of warning: this GCD can sometimes be a (non-zero) constant polynomial, which no matrix satisfies, in which case the set of polynomials has no matrix solutions at all.
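A quick illustration of the GCD reduction in sympy, with two made-up example polynomials (an assumption for the sake of illustration): since the GCD divides both originals, any matrix annihilated by the GCD is annihilated by both.

```python
from sympy import symbols, gcd, Poly, Matrix, eye, zeros

x = symbols('x')

# Two hypothetical example polynomials (assumed for illustration):
p1 = Poly(x**3 - 7*x**2 + 14*x - 8, x)   # (x - 1)(x - 2)(x - 4)
p2 = Poly(x**3 - 6*x**2 + 11*x - 6, x)   # (x - 1)(x - 2)(x - 3)
g = gcd(p1, p2)                          # (x - 1)(x - 2) = x**2 - 3*x + 2

def eval_at(p, P):
    """Evaluate a sympy Poly at a square matrix P (Horner's scheme)."""
    n = P.shape[0]
    res = zeros(n, n)
    for c in p.all_coeffs():             # coefficients, highest degree first
        res = res * P + c * eye(n)
    return res

# A matrix with eigenvalues 2 and 1, both roots of the GCD:
P = Matrix([[2, 1],
            [0, 1]])
print(eval_at(g, P) == zeros(2, 2))      # True: P satisfies the GCD...
print(eval_at(p1, P) == zeros(2, 2))     # True: ...hence p1...
print(eval_at(p2, P) == zeros(2, 2))     # True: ...and p2
```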

Another way to state this is that the kernel of $\operatorname{eval}_P$ is always just the set of multiples of some polynomial $m$. This polynomial (often assumed to be monic, to ensure uniqueness) is known as the minimal polynomial of $P$. When you say that a matrix satisfies some polynomial, you're really saying that its minimal polynomial divides that polynomial - and there are a number of good tools for describing which matrices have which minimal polynomial.
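One simple (though not the most efficient) way to compute the minimal polynomial is to search for the first power of $P$ that is a linear combination of the lower powers; a sketch in sympy, using exact rational arithmetic:

```python
from sympy import Matrix, Poly, eye, symbols

x = symbols('x')

def matrix_minpoly(P, x):
    """Minimal polynomial of P, found by looking for the first power of P
    that is a linear combination of the lower powers."""
    n = P.shape[0]
    powers = [eye(n)]
    for k in range(1, n + 1):
        powers.append(powers[-1] * P)
        # Columns are vec(I), vec(P), ..., vec(P^k).
        A = Matrix.hstack(*[M.reshape(n * n, 1) for M in powers])
        null = A.nullspace()
        if null:                 # first linear dependence found
            coeffs = null[0]     # c_0 I + c_1 P + ... + c_k P^k = 0
            return Poly(sum(coeffs[i] * x**i for i in range(k + 1)), x).monic()
    # Unreachable: by Cayley-Hamilton the degree is at most n.

J = Matrix([[2, 1],
            [0, 2]])             # a single 2x2 Jordan block
print(matrix_minpoly(J, x).as_expr())   # x**2 - 4*x + 4, i.e. (x - 2)**2
```

A matrix then satisfies a given polynomial exactly when `matrix_minpoly` of it divides that polynomial.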

It's worth noting that the reasoning of the question you link to does not extend to the general case, even for a single polynomial; for instance, a matrix can satisfy the equation $P^2=0$ without being the zero matrix, so not every solution is diagonalizable. In general, over an algebraically closed field such as $\mathbb C$, every matrix can be written in Jordan canonical form, and its minimal polynomial is the product of the factors $(x-\lambda)^k$, where $\lambda$ runs over the eigenvalues and $k$ is the size of the largest Jordan block with that eigenvalue. For fields that are not algebraically closed, one may use the rational canonical form to work similarly.
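The nilpotent counterexample, and the largest-block rule, are easy to check numerically:

```python
import numpy as np

# A nonzero matrix annihilated by x^2 (a nilpotent 2x2 Jordan block):
P = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(P @ P, 0))   # True: P satisfies P^2 = 0...
print(np.allclose(P, 0))       # False: ...yet P is not the zero matrix

# Jordan blocks of sizes 2 and 1 for eigenvalue 0: the largest block has
# size 2, so the minimal polynomial is x^2 even though the matrix is 3x3.
Q = np.zeros((3, 3))
Q[0, 1] = 1.0
print(np.allclose(Q @ Q, 0))   # True: x^2 already annihilates Q
```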