Roots of a matrix equation


Let $X \in \mathbb{C}^{n \times m}$ be a rectangular matrix of full rank, let $X^*$ be its Hermitian conjugate, let $A \in \mathbb{C}^{m \times m}$ be a square matrix, and let $f: \mathbb{C} \to \mathbb{C}$ be defined by $$ f(z) = \det \left( I_n + X \frac{1}{A - z I_m} X^* \right) $$ where $I_n$ is the $n \times n$ identity matrix.

Show that for $n \leq m$, the solutions of $f(z)= 0$ are given by $z$ equal to an eigenvalue of $B := A + X^* X$.

My attempt: for $n = m$, $f(z)$ can be straightforwardly rearranged to $$ f(z) = \det \left( X \frac{1}{B-A} \big( B - z I_m \big) \frac{1}{A - z I_m} X^* \right) = \frac{\det(X) \det(B - z I_m) \det(X^*)}{\det(A - z I_m) \det(B - A)} $$ where the desired result follows from the factor $\det(B - z I_m)$.

However I cannot factorise the determinant in this way for $n < m$, and I am unsure how to proceed.
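As a quick sanity check on the square-case factorisation, here is a small numpy experiment; the matrix size, seed, and test point $z$ are arbitrary choices, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4
# the square case n = m: a random complex X is full rank with probability 1
X = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
A = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
B = A + X.conj().T @ X

z = 0.3 + 0.7j  # arbitrary test point (almost surely not an eigenvalue of A)
I = np.eye(m)

# left-hand side: f(z) computed directly from the definition
f = np.linalg.det(I + X @ np.linalg.inv(A - z * I) @ X.conj().T)
# right-hand side: the fully factored form from the attempt above
factored = (np.linalg.det(X) * np.linalg.det(B - z * I) * np.linalg.det(X.conj().T)
            / (np.linalg.det(A - z * I) * np.linalg.det(B - A)))
print(abs(f - factored))  # tiny: the two expressions agree up to rounding
```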


There are 3 answers below.

Best answer:

With the Sylvester determinant identity, we have $$ \begin{aligned} \det\left(I_n + \left( X\frac 1{A - zI_m}\right)X^*\right) &= \det\left(I_m + X^*\left( X\frac 1{A - zI_m}\right)\right) \\ & =\det\left(I_m + \big(X^*X \big) \frac 1{A - zI_m}\right) \\ & =\det\left(\big(A - zI_m + X^*X\big)\frac 1{A - zI_m}\right) \\ & =\frac{\det\Big(\big(A + X^*X\big) - zI_m\Big)}{\det(A - zI_m)}. \end{aligned} $$
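The derived identity $f(z) = \det(B - zI_m)/\det(A - zI_m)$ is easy to probe numerically in the rectangular case $n < m$; a sketch with arbitrary random data and sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2, 5  # the rectangular case n < m
X = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))
A = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
B = A + X.conj().T @ X

def f(z):
    """f(z) computed directly from its definition as an n x n determinant."""
    return np.linalg.det(np.eye(n) + X @ np.linalg.inv(A - z * np.eye(m)) @ X.conj().T)

# f agrees with det(B - zI)/det(A - zI) at a generic test point ...
z = 0.2 - 0.4j
ratio = np.linalg.det(B - z * np.eye(m)) / np.linalg.det(A - z * np.eye(m))

# ... and vanishes at every eigenvalue of B = A + X*X
residuals = [abs(f(lam)) for lam in np.linalg.eigvals(B)]
print(abs(f(z) - ratio), max(residuals))
```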

Answer:

By the Schur complement, $$\det(z I_m-A)\,\det\left(I_n - X \frac{1}{z I_m-A} X^*\right) = \det(I_n)\det \left(z I_m-A-X^*X\right).$$ Since $\frac{1}{A - zI_m} = -\frac{1}{zI_m - A}$, the second factor on the left is exactly $f(z)$. So the roots $z$ are the eigenvalues of $A+X^*X$, not those of $A$.
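Both sides of this identity are Schur-complement factorizations of the determinant of the block matrix $\begin{pmatrix} zI_m - A & X^* \\ X & I_n \end{pmatrix}$, which can be checked numerically; a sketch with arbitrary random data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 2, 4
X = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))
A = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
z = 1.1 + 0.2j  # arbitrary test point

P = z * np.eye(m) - A
M = np.block([[P, X.conj().T],
              [X, np.eye(n)]])

# Schur complement of the top-left block: det(M) = det(P) det(I_n - X P^{-1} X*)
lhs = np.linalg.det(P) * np.linalg.det(np.eye(n) - X @ np.linalg.inv(P) @ X.conj().T)
# Schur complement of the bottom-right block: det(M) = det(I_n) det(P - X* X)
rhs = np.linalg.det(P - X.conj().T @ X)
detM = np.linalg.det(M)
print(abs(detM - lhs), abs(detM - rhs))  # both tiny
```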

Answer:

Lemma: Let $A$ be an $m\times n$ matrix and $B$ an $n\times m$ matrix. Then $AB$ and $BA$ have the same nonzero eigenvalues.

Proof: Let $\lambda\neq0$ be an eigenvalue of $AB$ and $x$ a corresponding eigenvector, thus $ABx=\lambda x\neq0$. Obviously $Bx\neq0$. $$BA(Bx)=B(ABx)=B(\lambda x)=\lambda Bx.$$ So $\lambda$ is also an eigenvalue of $BA$ with an eigenvector $Bx$. In other words, the set of nonzero eigenvalues of $AB$ is contained in the set of nonzero eigenvalues of $BA$. The symmetric argument leads to the desired conclusion. $\quad\square$
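The lemma is easy to illustrate numerically: the $m \times m$ product $AB$ should carry the eigenvalues of the $n \times n$ product $BA$ plus $m - n$ extra zeros. A sketch with arbitrary random complex matrices:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
B = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

ev_AB = np.linalg.eigvals(A @ B)  # m = 5 eigenvalues
ev_BA = np.linalg.eigvals(B @ A)  # n = 3 eigenvalues

# discard the m - n numerically-zero eigenvalues of AB, then compare the rest
nonzero_AB = ev_AB[np.abs(ev_AB) > 1e-8]
match = np.allclose(np.sort_complex(nonzero_AB), np.sort_complex(ev_BA))
print(len(nonzero_AB), match)
```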

I attempted to prove that $AB$ and $BA$ have the same eigenvalues together with their algebraic multiplicities. After writing the above lemma, I realized that probably the best way to prove that stronger statement is simply the Schur complement; there is also another, similar method to prove the same.

The above proposition is stronger than $\det(I+AB)=\det(I+BA)$, but then again we can just appeal to my other answer, which applies the Schur complement directly.

Alternatively, we can confine ourselves to the narrow circumstances of this particular problem, which only asks when the determinant is zero. As @BenGrossman suggested, $\det(I+AB)=0 \iff AB$ has eigenvalue $-1 \iff BA$ has eigenvalue $-1\iff \det(I+BA)=0$. The rest proceeds via factorization by setting (in the lemma's notation, which reuses the problem's letters) $A:=X$ and $B:=\frac 1{A - zI_m}X^*$.
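The final substitution can be checked numerically: with the lemma's two factors (named `P` and `Q` below to avoid clashing with the problem's $A$ and $B$), $\det(I_n + PQ)$ is exactly $f(z)$ and equals $\det(I_m + QP)$. A sketch with arbitrary random data:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 2, 4
X = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))
A = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
z = 0.5 - 0.1j  # arbitrary test point

# lemma's factors: P := X (n x m) and Q := (A - zI)^{-1} X*  (m x n)
P = X
Q = np.linalg.inv(A - z * np.eye(m)) @ X.conj().T

lhs = np.linalg.det(np.eye(n) + P @ Q)  # this n x n determinant is f(z)
rhs = np.linalg.det(np.eye(m) + Q @ P)  # the m x m counterpart
print(abs(lhs - rhs))  # tiny: det(I + PQ) = det(I + QP)
```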