Finding all polynomials $q,f$ such that $q(C)$ has non-zero kernel and $f(C)$ is invertible

This question comes from a qualifying exam.

Let $C$ be an $n × n$ real matrix with $n ≥ 3$. (a) For which real polynomials $q$ of degree 2 is the null space of $q(C)$ not the zero subspace? (b) For which real polynomials $f$ of degree $k ≥ 3$ is $f(C)$ invertible?

I've never faced a similar question in linear algebra before. Here is my attempt at item (a); I am still looking for ways to approach item (b).

We identify $C$ with its associated operator on the vector space $V = \mathbb R^n$, which will also be denoted by $C$. Let $m = p_1^{c_1}\cdots p_s^{c_s}$ be the factorization of the minimal polynomial of $C$ into powers of distinct irreducibles (not necessarily all linear). By the primary cyclic decomposition theorem, we know that $$V = (\langle v_{1,1}\rangle\oplus \cdots \oplus \langle v_{1,k_1} \rangle) \bigoplus \cdots \bigoplus ( \langle v_{s,1} \rangle \oplus \cdots \oplus \langle v_{s,k_s} \rangle),$$ where $\langle v_{i,j} \rangle$ denotes the cyclic subspace generated by $v_{i,j}$, the minimal polynomial of the restriction of $C$ to $\langle v_{i,j} \rangle$ is exactly $p_{i}^{c_{i,j}}$, and $c_{i,1} = c_i \geq c_{i,2} \geq \cdots \geq c_{i,k_i}$.

It seems that we should take any product $p$ of a subcollection of the polynomials $p_i^{c_{i,j}}$ whose degrees $c_{i,j}\deg(p_i)$ add up to $2$. Since those polynomials are relatively prime to one another when $p_i\neq p_k$, the product $p$ will be the minimal polynomial of the sum of the corresponding vectors $v_{i,j}$, so $p(C)$ will have a non-trivial kernel. However, I am unsure how to prove cleanly that such a subcollection of total degree $2$ exists.
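As a sanity check on this idea, here is a small sympy sketch. The matrix below is an assumed example (not from the question) whose minimal polynomial is $(x-2)(x^2+1)$; applying the irreducible quadratic factor $q(x) = x^2+1$ to $C$ indeed produces a matrix with non-trivial kernel.

```python
import sympy as sp

# Assumed 3x3 example: block-diagonal with the real eigenvalue 2 and a
# rotation block, so the minimal polynomial is (x - 2)(x^2 + 1).
C = sp.Matrix([[2, 0, 0],
               [0, 0, -1],
               [0, 1, 0]])

# Degree-2 choice built from an irreducible factor of the minimal polynomial:
q_of_C = C**2 + sp.eye(3)          # q(x) = x^2 + 1 evaluated at C

print(q_of_C.nullspace())          # non-trivial: the kernel has dimension 2 here
print(q_of_C.det())                # 0, confirming q(C) is singular
```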

2 Answers

Let $\lambda_i$ $(i=1,2,\dots,n)$ denote the eigenvalues of the matrix $C$, counted with multiplicity and possibly complex.

a) Construct a polynomial $q(x)=(x-\lambda_i)g(x)$, where $g(x)=ax+b$ with $a(\neq 0), b\in\mathbb R$ is any real linear factor. If $\lambda_i$ is complex, choose $g(x)=(x-\overline\lambda_i)$ instead, so that $q$ has real coefficients. Then $\det q(C)=\prod_j q(\lambda_j)=0$, so $q(C)$ has a non-trivial null space.

b) Consider any polynomial $f(x)$ of degree $k\ge 3$ such that $f(\lambda_i)\ne 0$ for every eigenvalue $\lambda_i$. Then $\det f(C) = \prod_i f(\lambda_i) \neq 0$, so $f(C)$ is invertible.
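The identity $\det f(C)=\prod_i f(\lambda_i)$ behind both parts can be checked numerically. Below is a minimal sketch with an assumed example matrix (eigenvalues $i$, $-i$, $3$) and an assumed degree-3 polynomial $f(x)=x^3+2x+5$ that misses all three eigenvalues.

```python
import numpy as np

# Assumed example: rotation block (eigenvalues +-i) plus the eigenvalue 3.
C = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 3.0]])

f = lambda z: z**3 + 2*z + 5              # f(i), f(-i), f(3) are all nonzero
fC = C @ C @ C + 2*C + 5*np.eye(3)        # f evaluated at the matrix C

eigs = np.linalg.eigvals(C)
lhs = np.linalg.det(fC)
rhs = np.prod([f(lam) for lam in eigs]).real

print(np.isclose(lhs, rhs))               # True: det f(C) = prod f(lambda_i)
print(np.isclose(lhs, 0.0))               # False: f(C) is invertible
```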

For (a) let $m(x)$ be the minimal polynomial of $C$ and write it as a product of irreducibles (each of degree at most 2, by the fundamental theorem of algebra). If $q$ is not in the principal ideal generated by any of these irreducibles (i.e. no irreducible factor of $m$ divides $q$, or more succinctly: $m$ and $q$ have non-zero resultant) then (i) $q(C)$ is invertible; conversely, (ii) $q(C)$ has a non-trivial kernel when $q$ is in one of said ideals. (A different way of stating this: assume WLOG that $q$ is monic; if $q$ is irreducible, check whether it matches a degree-2 irreducible factor of $m$, and if $q$ factors into linear factors, check each of them against the linear factors of $m$.)

In case (ii), write $q = p\,r$ where $p$ is an irreducible factor shared with $m$, and $m = p\,h$. If $q(C)$ were invertible, so would $p(C)$ be; then $p(C)h(C) = m(C) = 0$ would force $h(C) = 0$ with $\deg h < \deg m$, contradicting the minimality of $m$.
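The gcd/resultant criterion can be spelled out concretely with sympy. The minimal polynomial below is an assumed example, and the two candidate $q$'s are chosen to land in cases (ii) and (i) respectively.

```python
import sympy as sp

x = sp.symbols('x')
# Assumed example: minimal polynomial m(x) = (x - 2)(x^2 + 1).
m = (x - 2)*(x**2 + 1)

q_bad  = x**2 + 1            # shares an irreducible factor with m: case (ii)
q_good = x**2 + x + 1        # coprime to m: case (i)

# gcd(q, m) != 1  <=>  resultant(q, m) == 0  <=>  q(C) is singular
print(sp.gcd(q_bad, m))            # x**2 + 1
print(sp.resultant(q_bad, m, x))   # 0
print(sp.gcd(q_good, m))           # 1
print(sp.resultant(q_good, m, x))  # nonzero
```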

In case (i), if you don't want to work over a splitting field, you could probably argue with Bézout's identity / resultants, but the expedient argument is to use the real Schur form of $C$: $q$ cannot annihilate any $2\times 2$ diagonal block (such a block has an irreducible quadratic as its minimal polynomial, which would then have to divide both $q$ and $m$, contrary to assumption), and the $1\times 1$ blocks are trivial, since $q(\lambda)\neq 0$; otherwise $(x-\lambda)$ would be a common factor of $q$ and $m$. Conclude that $q(C)$ is invertible, since it is similar to a block triangular matrix with each diagonal block invertible.
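This Schur-form argument can be illustrated numerically. The matrix and the polynomial $q(x)=x^2+x+1$ below are assumed examples; since $\det q(C)$ equals $\det q(T)$ for the (orthogonally similar) real Schur form $T$, the determinant reduces to a product over the $1\times 1$ and $2\times 2$ diagonal blocks.

```python
import numpy as np
from scipy.linalg import schur

# Assumed example: C has the real eigenvalue 2 and a complex pair +-i; its
# real Schur form T is block upper triangular with 1x1 and 2x2 diagonal blocks.
C = np.array([[2.0, 1.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
T, Z = schur(C, output='real')

q = lambda M: M @ M + M + np.eye(3)   # q(x) = x^2 + x + 1, coprime to m here

# q(C) = Z q(T) Z^T, and q(T) inherits the block triangular pattern, so
# det q(C) is the product of the determinants of q over the diagonal blocks.
print(np.isclose(np.linalg.det(q(C)), np.linalg.det(q(T))))   # True
print(np.isclose(np.linalg.det(q(C)), 0.0))                   # False: invertible
```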

For (b), write $f$ as a product of irreducibles: if $f$ shares no irreducible factor with $m$, apply (i) repeatedly to conclude that $f(C)$ is invertible; if there is a common factor, apply (ii) to see that $f(C)$ is singular.
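Putting (i) and (ii) together for part (b), here is a minimal sympy sketch. The matrix, its minimal polynomial, the two degree-3 polynomials, and the helper `apply_poly` are all assumptions of mine for illustration, not part of the answer above.

```python
import sympy as sp

x = sp.symbols('x')
# Assumed example with minimal polynomial m(x) = (x - 2)(x^2 + 1).
C = sp.Matrix([[2, 0, 0],
               [0, 0, -1],
               [0, 1, 0]])
m = (x - 2)*(x**2 + 1)

f1 = x**3 + 1              # gcd(f1, m) = 1        -> f1(C) invertible
f2 = (x**2 + 1)*(x + 5)    # shares x^2 + 1 with m -> f2(C) singular

def apply_poly(p, M):
    """Evaluate the polynomial p(x) at the matrix M (Horner's scheme)."""
    coeffs = sp.Poly(p, x).all_coeffs()
    out = sp.zeros(*M.shape)
    for c in coeffs:
        out = out * M + c * sp.eye(M.shape[0])
    return out

print(apply_poly(f1, C).det())   # nonzero: invertible
print(apply_poly(f2, C).det())   # 0: singular
```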