Matrices that Differ only in Diagonal of Decomposition


Suppose that $\mathbf A_1$ and $\mathbf A_2$ are $n \times n$ matrices. Are there necessary and sufficient conditions for the existence of $n \times n$ matrices $\mathbf U$ and $\mathbf V$ and $n \times n$ diagonal matrices $\mathbf D_1$ and $\mathbf D_2$ that satisfy

$$ \mathbf A_1 = \mathbf U \mathbf D_1 \mathbf V , \quad \text{and} \quad \mathbf A_2 = \mathbf U \mathbf D_2 \mathbf V?$$


A sufficient condition is that $\mathbf A_1$ and $\mathbf A_2$ are both diagonalizable with a common basis of eigenvectors; the result then follows from the eigendecomposition, with $\mathbf V = \mathbf U^{-1}$. I was hoping to find a necessary condition, perhaps using the LDU decomposition.
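This sufficient condition is easy to see numerically. Below is a minimal numpy sketch (the matrices are my own toy example): $\mathbf A_1$ and $\mathbf A_2$ are built from a common eigenvector matrix $\mathbf U$, and the eigendecomposition of $\mathbf A_1$ alone recovers a basis that also diagonalizes $\mathbf A_2$. This relies on the eigenvalues of $\mathbf A_1$ being distinct, so its eigenvectors are determined up to scale.

```python
import numpy as np

# Toy illustration of the sufficient condition (the numbers are mine):
# build A1 and A2 from a common eigenvector matrix U, so that
# A_i = U @ D_i @ V with V = inv(U).
U = np.array([[1.0, 1.0],
              [1.0, -1.0]])
V = np.linalg.inv(U)
A1 = U @ np.diag([2.0, 3.0]) @ V
A2 = U @ np.diag([5.0, -1.0]) @ V

# Recover a common decomposition from A1 alone via its eigendecomposition,
# and check that the recovered eigenvector matrix also diagonalizes A2.
_, U_rec = np.linalg.eig(A1)
M = np.linalg.inv(U_rec) @ A2 @ U_rec
print(np.allclose(M, np.diag(np.diag(M))))   # True: M is diagonal
```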

Best answer

I'll be dealing with $n\times n$ matrices over a field $F$. I hope to make you at least partially happy: the results below are partial, but they concern the most common and simplest cases. I shall call matrices $A_1$ and $A_2$ simultaneously diagonalizable provided there exist diagonal matrices $D_1$, $D_2$ and invertible matrices $U$, $V$ such that $A_1=UD_1V$ and $A_2=UD_2V$. Consider the polynomial matrix $A(x)=xA_1+A_2$ with entries in $F[x]$. Let $r$ be the rank of the matrix $A(x)$ and let $d_r(A(x))\ne 0$ be its $r$-th determinant divisor. Let

$$d_r(A(x))=(x-x_1)^{r_1}\cdots (x-x_k)^{r_k},$$

where $x_i$ are distinct elements of the field $F$.

I state the following two propositions and present my proof ideas.

Proposition 1. If the matrices $A_1$ and $A_2$ are simultaneously diagonalizable, then no elementary divisor of the matrix $A(x)$ has multiple roots.

Proof idea. Let $D_1$, $D_2$ be diagonal and $U$, $V$ be invertible matrices such that $A_1=UD_1V$ and $A_2=UD_2V$. Then $U^{-1}A(x)V^{-1}=D_1x+D_2\equiv D(x)$. Therefore the matrices $A(x)$ and $D(x)$ have the same Smith normal form. Then for each $i\le r$, the $i$-th determinant divisor $d_i(D(x))$ of the matrix $D(x)$ satisfies the equality

$$d_i(D(x))= (x-x_1)^{\max\{r_1+i-r,0\}}\cdots (x-x_k)^{\max\{r_k+i-r,0\}}.$$

Thus each elementary divisor $\alpha_i(A(x))= \alpha_i(D(x))=\frac{d_i(D(x))}{d_{i-1}(D(x))}$ divides the product $$(x-x_1) \cdots (x-x_k),$$ and so has no multiple roots. $\square$

Proposition 2. If $|A_1|\ne 0$ and no elementary divisor of the matrix $A(x)$ has multiple roots, then the matrices $A_1$ and $A_2$ are simultaneously diagonalizable.

Proof idea. Let $D(x)\equiv D_1x+D_2$ be a diagonal matrix which has an entry $x-x_k$ exactly $r_k$ times for each $k$. Since $\alpha_i\mid \alpha_{i+1}$ for each $1\le i<r$ and no elementary divisor $\alpha_i(A(x))$ of the matrix $A(x)$ has multiple roots, we can easily see that the matrix $D(x)$ has the same elementary divisors as the matrix $A(x)$, that is, $\alpha_i(A(x))=\alpha_i(D(x))$ for each $i$. Thus the matrices $A(x)$ and $D(x)$ have the same Smith normal form. Therefore there exist invertible matrices $U(x), V(x)$ with entries in $F(x)$ such that $U(x)A(x)V(x)=D(x)$. Similarly to the proof of Theorem 6 from [Gan, Ch. VI, $\S 4$] we can show (and only here do we use that $|A_1|$ is non-zero) that there exist invertible matrices $U, V$ with entries in $F$ such that $UA(x)V=D(x)$. Then $A_1=U^{-1}D_1V^{-1}$ and $A_2=U^{-1}D_2V^{-1}$, so $A_1$ and $A_2$ are simultaneously diagonalizable. $\square$
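As a sanity check on the necessary condition of Proposition 1, the determinant divisors $d_k$ of the pencil $A(x)=xA_1+A_2$ can be computed directly as gcds of $k\times k$ minors. Here is a sympy sketch (the helper name `determinant_divisors` is mine, and the brute-force enumeration of minors is exponential in $n$, so this is only for tiny examples), applied to a Jordan block, where the criterion correctly fails:

```python
from itertools import combinations
from sympy import Matrix, symbols, gcd, diff, degree, eye, cancel

x = symbols('x')

def determinant_divisors(M):
    """d_k(M) = gcd of all k x k minors of M (brute force, tiny matrices only)."""
    n = M.rows
    ds = []
    for k in range(1, n + 1):
        d = 0
        for rows in combinations(range(n), k):
            for cols in combinations(range(n), k):
                d = gcd(d, M[list(rows), list(cols)].det())
        ds.append(d)
    return ds

# Example: A1 = I, A2 a Jordan block -- NOT simultaneously diagonalizable.
A1 = eye(2)
A2 = Matrix([[1, 1], [0, 1]])
A = x * A1 + A2                      # the pencil A(x)

d = determinant_divisors(A)          # d_1 = 1, d_2 = (x + 1)**2
alphas = [cancel(d[0])] + [cancel(d[k] / d[k - 1]) for k in range(1, len(d))]

# Proposition 1's necessary condition: every alpha_i must be squarefree.
squarefree = all(degree(gcd(a, diff(a, x)), x) <= 0 for a in alphas)
print(squarefree)                    # False for the Jordan block example
```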

I stop here, because I have already called a colleague of mine who is a matrix theorist, and he got interested in the problem. On Monday he returns to Lviv, and I hope to visit him for a long talk over tea about this and other matrix-related MSE questions. (But it may be hard to reach complete happiness in this imperfect world, so his answer to your question over $\Bbb R[x]$ may be: "Since $\Bbb R[x][y]$ is not even a principal ideal domain, this is a very hard problem (and much harder when we are dealing with singular matrices), and there are results only in very particular cases".)

I hope to improve both propositions a bit by using the pencil $A(x,y)=A_1x+A_2y$ instead of the matrix $A(x)$, similarly to the beginning of [Gan, Ch. XII]. Unfortunately, those results are not directly applicable to our problem, because the author is dealing with number fields.

References

[Gan] Feliks Ruvimovich Gantmakher, The Theory of Matrices (available in Russian and in English translation).

Answer 2

This answer uses an approach different from that of my first answer.

I shall call matrices $A_1$ and $A_2$ simultaneously diagonalizable provided there exist diagonal matrices $D_1$, $D_2$ and invertible matrices $U$, $V$ such that $A_1=UD_1V$ and $A_2=UD_2V$.

Proposition. Let $A_1$ be an invertible matrix. Then the matrices $A_1$ and $A_2$ are simultaneously diagonalizable iff the matrix $A_2A_1^{-1}$ is similar to a diagonal matrix. Moreover, if we are considering the matrices over an algebraically closed field, then both these conditions are equivalent to the diagonality of the Jordan normal form of the matrix $A_2A_1^{-1}$.

Proof. If matrices $A_1$ and $A_2$ are simultaneously diagonalizable then

$$A_2A_1^{-1}=UD_2VV^{-1}D_1^{-1}U^{-1}= UD_2D_1^{-1}U^{-1},$$

that is, the matrix $A_2A_1^{-1}$ is similar to a diagonal matrix. Conversely, assume that there exist a diagonal matrix $D$ and an invertible matrix $U$ such that $A_2A_1^{-1}=UDU^{-1}$. Put $V=U^{-1}A_1$. Then

$$U^{-1}A_2V^{-1}= U^{-1} UDU^{-1}A_1A_1^{-1}U=D$$ and

$$U^{-1}A_1V^{-1}=U^{-1}A_1A_1^{-1}U=I,$$

so the matrices $A_1$ and $A_2$ are simultaneously diagonalizable, with $D_1=I$ and $D_2=D$. $\square$
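The converse construction in this proof is easy to run numerically. Here is a small numpy sketch (the matrices are my own example, chosen so that $A_2A_1^{-1}$ has real distinct eigenvalues and is therefore diagonalizable):

```python
import numpy as np

# Numerical sketch of the converse construction (the matrices are my example):
# diagonalize A2 @ inv(A1) as U D U^{-1}, then set V = U^{-1} A1, so that
# A1 = U I V and A2 = U D V.
A1 = np.array([[2.0, 1.0],
               [0.0, 1.0]])
A2 = np.array([[3.0, 4.0],
               [1.0, 2.0]])

eigvals, U = np.linalg.eig(A2 @ np.linalg.inv(A1))  # assumes diagonalizability
V = np.linalg.inv(U) @ A1
D1 = np.eye(2)            # A1 = U @ D1 @ V
D2 = np.diag(eigvals)     # A2 = U @ D2 @ V

print(np.allclose(U @ D1 @ V, A1), np.allclose(U @ D2 @ V, A2))
```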

Answer 3

Here I quote the classical argument for the special case I mentioned in the comments.

Let $S$ and $T$ be diagonalizable operators on $V$. (Note that $T$ is diagonalizable iff there exists a basis of $V$ consisting of eigenvectors of $T$; I'll use this freely.) It's not so hard to prove that $S, T$ simultaneously diagonalizable $\implies S, T$ commute, so we'll leave that to you. We prove that $S, T$ commute $\implies S, T$ simultaneously diagonalizable.

Let $\lambda_1, \dots, \lambda_m$ be the distinct eigenvalues of $S$ and $V_1, \dots, V_m$ be the corresponding eigenspaces $V_i = \{v \in V : Sv = \lambda_i v\}$. Any bases $e_{i, 1}, \dots, e_{i, d_i}$ of the $V_i$ combine to give a basis $$ e_{1, 1}, \dots, e_{1, d_1}; \dots; e_{m, 1}, \dots, e_{m, d_m} \tag{1} $$ of $V$ with respect to which $S$ has a diagonal matrix.

Each $V_i$ is an $S$-invariant subspace, and since $S, T$ commute, it's also a $T$-invariant subspace. To prove this, let $v \in V_i$. We show $Tv \in V_i$. By definition, $Sv = \lambda_i v$. Applying $T$ gives $TSv = \lambda_i Tv$. Since $S, T$ commute, $S(Tv) = \lambda_i (Tv)$. Hence $Tv \in V_i$, and $V_i$ is $T$-invariant.

Now the matrix of $T$ with respect to the basis $(1)$ is block diagonal since the $V_i$ are $T$-invariant. But it's not necessarily diagonal. For the matrix of $T$ to be diagonal we'd need a basis of $V_i$ consisting of eigenvectors of $T$. However, because $V_i$ is a $T$-invariant subspace and $T$ is diagonalizable, such a basis exists. For various proofs, see this post*. With respect to this $T$-eigenbasis of $V_i$, the matrix of $S$ is still diagonal (since the bases of the $V_i$ in $(1)$ were arbitrary), and now the matrix of $T$ is diagonal, so we've simultaneously diagonalized $S, T$.

[*If all the eigenvalues of $S$ are different, then the $V_i$ are $1$-dimensional $T$-invariant subspaces; hence the vectors $(1)$ are eigenvectors of $T$ also, and the matrix of $T$ with respect to the basis $(1)$ is also diagonal, completing the proof in this special case. The proof when $V_i$ is not $1$-dimensional is slightly more difficult.]
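For real symmetric matrices, the whole argument can be carried out numerically with orthonormal bases. Below is a toy numpy sketch (the matrices $S$ and $T$ are my own example, and the tolerances are ad hoc): $S$ has a repeated eigenvalue, $T$ commutes with $S$, and an $S$-eigenbasis is refined into a common eigenbasis by diagonalizing the restriction of $T$ to each eigenspace of $S$.

```python
import numpy as np

# Toy numerical version of the argument above, for real symmetric S and T
# (my example): S has eigenvalue 1 with a 2-dimensional eigenspace, and T
# commutes with S.
S = np.diag([1.0, 1.0, 2.0])
T = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 3.0]])
assert np.allclose(S @ T, T @ S)          # the hypothesis: S and T commute

eigvals, Q = np.linalg.eigh(S)            # orthonormal eigenbasis of S
blocks = []
for lam in np.unique(np.round(eigvals, 8)):
    E = Q[:, np.abs(eigvals - lam) < 1e-8]   # orthonormal basis of V_lam
    # T maps V_lam into itself, so E.T @ T @ E is T restricted to V_lam;
    # diagonalize the restriction and pull its eigenvectors back to V_lam.
    _, W = np.linalg.eigh(E.T @ T @ E)
    blocks.append(E @ W)
B = np.hstack(blocks)                     # common eigenbasis, as columns

for M in (S, T):
    D = B.T @ M @ B
    print(np.allclose(D, np.diag(np.diag(D))))   # True for both S and T
```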