How to prove that adding back one of the deleted rows and columns of the matrix will not increase the polynomial degree of the GCD of the largest minors by more than 1?


$\textbf{A}$ is an $N \times N$ matrix and $\textbf{B}$ is an $N \times M$ matrix. The degree (in $\lambda$) of the greatest common divisor of the largest ($N \times N$) minors of the matrix $[\textbf{A}-\lambda \textbf{I} \quad \textbf{B}]$ is $0$, where $\lambda$ is a scalar indeterminate and $\textbf{I}$ is the $N \times N$ identity matrix. If we add one row and one column to $\textbf{A}$ (and a corresponding row to $\textbf{B}$) to obtain the matrix $[\textbf{A}_1-\lambda \textbf{I} \quad \textbf{B}_1]$, how can one prove that the degree of the greatest common divisor of the largest ($(N+1) \times (N+1)$) minors of $[\textbf{A}_1-\lambda \textbf{I} \quad \textbf{B}_1]$ is smaller than $2$?

For example, $$ \textbf{A} = \left( \matrix{0 & 0\cr a_{21} & 0} \right) $$

$$ \textbf{B} = \left( \matrix{b_1 \cr 0} \right) $$ Adding one row and one column to $\textbf{A}$ (and one row to $\textbf{B}$) gives $[\textbf{A}_1-\lambda \textbf{I} \quad \textbf{B}_1]$ as $$ \left( \matrix{-\lambda & 0 & a_{13} & b_1\cr a_{21} & -\lambda & a_{23} & 0 \cr a_{31} & a_{32} & a_{33}-\lambda & b_3} \right) $$
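As a sanity check (not a proof), the two GCDs can be computed symbolically. The sketch below uses SymPy with fully generic symbolic entries: it computes the degree in $\lambda$ of the GCD of all maximal minors, first for the $2 \times 3$ pencil $[\textbf{A}-\lambda \textbf{I} \quad \textbf{B}]$ and then for the augmented $3 \times 4$ pencil; the helper name and symbols are my own and only for illustration.

```python
from functools import reduce
from itertools import combinations

import sympy as sp

lam = sp.symbols('lambda')
a13, a21, a23, a31, a32, a33 = sp.symbols('a13 a21 a23 a31 a32 a33')
b1, b3 = sp.symbols('b1 b3')

def gcd_of_largest_minors_degree(M, var):
    """Degree in `var` of the GCD of all maximal (row-sized) minors of M."""
    n = M.rows
    # every choice of n columns out of M.cols gives one largest minor
    minors = [M[:, list(cols)].det()
              for cols in combinations(range(M.cols), n)]
    g = reduce(sp.gcd, minors)
    return sp.degree(g, var)

# original 2x3 pencil [A - lambda*I  B]
M0 = sp.Matrix([[-lam, 0,    b1],
                [a21,  -lam, 0 ]])

# augmented 3x4 pencil [A1 - lambda*I  B1]
M1 = sp.Matrix([[-lam, 0,    a13,        b1],
                [a21,  -lam, a23,        0 ],
                [a31,  a32,  a33 - lam,  b3]])

print(gcd_of_largest_minors_degree(M0, lam))  # degree for the 2x2 minors
print(gcd_of_largest_minors_degree(M1, lam))  # degree for the 3x3 minors
```

For $M_0$ the three $2 \times 2$ minors are $\lambda^2$, $-a_{21} b_1$ and $b_1 \lambda$, whose GCD is constant when $a_{21} b_1 \neq 0$, matching the degree-$0$ hypothesis; the question asks why the corresponding degree for $M_1$ cannot reach $2$.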