If $\det(A + t_i B) = 0$ for $n + 1$ distinct $t_i$, can I find $V, W$ such that $A(V), B(V) \subset W$ and $\dim W < \dim V$?


Let $A, B$ be two real matrices of order $n$.

Let $(t_i)_{1 \leq i \leq n + 1}$ be a family of $n + 1$ distinct real numbers.

I would like to show the equivalence between:

  1. $\forall i, \det(A + t_i B) = 0$
  2. $\exists V, W \text{ subspaces of } \mathbb{R}^n \text{ such that } A(V) \subset W, B(V) \subset W, \dim W < \dim V$

$2 \implies 1$ is obvious.
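Indeed, if $A(V) \subset W$ and $B(V) \subset W$, then $(A + tB)(V) \subset W$ for every $t$, so $\operatorname{rank}(A + tB) \leq \dim W + (n - \dim V) < n$ and the determinant vanishes. Here is a small numerical sanity check of this direction (a sketch with a hypothetical example pair, not from the post; numpy assumed):

```python
import numpy as np

# Hypothetical example with n = 3, V = span(e1, e2), W = span(e1):
# A(V) ⊂ W and B(V) ⊂ W mean the first two columns of A and B lie in span(e1).
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 4.0],
              [0.0, 0.0, 5.0]])
B = np.array([[6.0, 7.0, 8.0],
              [0.0, 0.0, 9.0],
              [0.0, 0.0, 1.0]])

# A + tB maps the 2-dimensional V into the 1-dimensional W, so its rank
# is at most dim W + (n - dim V) = 2 < 3 and its determinant vanishes.
for t in [-1.0, 0.0, 0.5, 2.0, 10.0]:
    assert abs(np.linalg.det(A + t * B)) < 1e-8
```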

But I have difficulty showing $1 \implies 2$.

What I tried is:

I tried looking at the eigenspaces of $A + t_i B$, but I'm unsure whether that is relevant. Alternatively, I can take, for every $t \in \mathbb{R}^{*}$, a vector $\varepsilon_t \in \ker (A + t B)$ and consider $V = \mathrm{span}((\varepsilon_t)_{t \in \mathbb{R}^{*}})$, the vector space generated by this family.

But I'm not sure what an interesting choice of $W$ would be, other than the zero subspace.

Best answer:

Note that $\det(A+tB)$ is a polynomial of degree at most $n$ in $t$. Hence if it has $n+1$ distinct zeros, it must be the zero polynomial.
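The degree claim is easy to check symbolically (a sketch with sympy and an arbitrary illustrative pair; the coefficient of $t^n$ is $\det B$):

```python
import sympy as sp

t = sp.symbols('t')
# Two arbitrary 3x3 matrices, chosen only for illustration.
A = sp.Matrix([[1, 2, 0], [3, 1, 1], [0, 4, 2]])
B = sp.Matrix([[0, 1, 1], [2, 0, 3], [1, 1, 0]])

# det(A + tB) expands to a polynomial in t of degree at most n = 3;
# the leading coefficient (of t^3) is det(B).
p = sp.expand((A + t * B).det())
print(p)
assert sp.degree(p, t) <= 3
assert p.coeff(t, 3) == B.det()
```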

Now, since $\det(A+tB)=0$ for all $t$, there exists a kernel element in $\mathbb{R}[t]\otimes\mathbb{R}^n$, i.e. there exist vectors $v_0,\dots,v_m$ such that $(A+tB)\left(\sum_i t^i v_i\right)=0$ (w.l.o.g. $v_m\neq 0$). Comparing coefficients of the powers of $t$, this implies $Av_0=0$, $Bv_m=0$ and $Bv_i = -Av_{i+1}$ for $i<m$. Hence $A$ and $B$ together send $V = \mathrm{span}(v_0,\dots,v_m)$ into $W = \mathrm{span}(Bv_0,\dots,Bv_{m-1})$, so $\dim W \leq m$; choosing the kernel element of minimal degree $m$, one can check that $v_0,\dots,v_m$ are linearly independent, so $\dim V = m+1 > \dim W$ and the result follows.

Edit: In general, a tuple of matrices $(A_1,\dots,A_m)$ is called a compression space if it satisfies 2. It is always true that 2 implies $\det(t_1A_1+\dots+t_m A_m)=0$ for all $t_i\in\mathbb{R}$, but the converse holds if and only if $m\leq 2$ or $n\leq 2$. The $m=2$ case is dealt with in the Kronecker–Weierstrass theory of pencils.

Edit 2: Before giving a proof, maybe some intuition will help. For each $t\in\mathbb{R}$, there is a vector $v(t)$ such that $(A+tB)v(t)=0$, because $\det(A+tB)=0$. All we need to show is that $v(t)$ can be chosen to be polynomial in $t$. Here is the proof:

Considering $t$ as an indeterminate, observe that $A+tB$ is a matrix with entries in $\mathbb{R}[t]$; it is an $\mathbb{R}[t]$-module map from $\mathbb{R}[t]^n$ to $\mathbb{R}[t]^n$. Take the field of fractions $\mathbb{R}(t)=\{\frac{f}{g}\mid f,g\in\mathbb{R}[t],\ g\neq 0\}$. Then we can also consider $A+tB$ as a linear map from $\mathbb{R}(t)^n$ to $\mathbb{R}(t)^n$. Its determinant is still $0$, so there is a kernel element of the form $$v=\begin{bmatrix} \frac{f_1}{g_1}\\ \frac{f_2}{g_2}\\ \vdots\\ \frac{f_n}{g_n}\\ \end{bmatrix}$$

In other words, $(A+tB)v=0$ in $\mathbb{R}(t)^n$. Multiplying $v$ by $g_1g_2\dots g_n$ puts it in $\mathbb{R}[t]^n$, and the equation $(A+tB)v=0$ still holds, so we obtain a kernel vector whose entries are polynomials. That is, a vector of the form $v=v_m t^m+\dots+v_0\in\mathbb{R}[t]\otimes \mathbb{R}^n\cong\mathbb{R}[t]^n$ such that $(A+tB)v=0$.
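This denominator-clearing step can be carried out mechanically (a sketch with sympy on a hypothetical pencil whose determinant vanishes identically; `nullspace` works over the fraction field $\mathbb{R}(t)$):

```python
import sympy as sp

t = sp.symbols('t')
# Hypothetical pencil with det(A + tB) identically zero (first two columns
# of A and B lie in span(e1)).
A = sp.Matrix([[1, 2, 3], [0, 0, 4], [0, 0, 5]])
B = sp.Matrix([[6, 7, 8], [0, 0, 9], [0, 0, 1]])

# sympy computes the kernel over the fraction field R(t):
v = (A + t * B).nullspace()[0]      # entries are rational functions of t

# Clear denominators: multiply v by the lcm of the entries' denominators,
# as in the argument above (the product g_1...g_n also works).
d = sp.Integer(1)
for e in v:
    d = sp.lcm(d, sp.fraction(sp.cancel(e))[1])
w = sp.expand(v * d)

# w is a polynomial kernel vector: (A + tB) w = 0 in R[t]^n.
assert sp.expand((A + t * B) * w) == sp.zeros(3, 1)
assert all(sp.cancel(e).is_polynomial(t) for e in w)
```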