analysis of matrix multiplication via singular-value-decomposition?


Consider a square real matrix $\mathbf{A}$ and a real symmetric matrix $\mathbf{S}$ of the same size.

If I can diagonalise $\mathbf{A}$, such that $\mathbf{A}=\mathbf{B}\mathbf{D}\mathbf{B}^{-1}$, then I can easily analyse the powers of $\mathbf{A}$, since $\mathbf{A}^{t}=\mathbf{B}\mathbf{D}^{t}\mathbf{B}^{-1}$.
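As a quick numerical sanity check of this identity (with a made-up $2\times 2$ matrix, not from the question):

```python
import numpy as np

# Hypothetical example matrix; its eigenvalues (0.6 and 0.3) are real and
# distinct, so it is diagonalisable over the reals.
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])

eigvals, B = np.linalg.eig(A)   # A = B D B^{-1}, D = diag(eigvals)
t = 5

# Power via the diagonalisation: A^t = B D^t B^{-1}
A_t = B @ np.diag(eigvals**t) @ np.linalg.inv(B)

# Agrees with direct repeated multiplication
assert np.allclose(A_t, np.linalg.matrix_power(A, t))
```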

Is there a "similar" way to simplify $\mathbf{E}(t)=\mathbf{A}^{t}\,\mathbf{S}\,\big(\mathbf{A}^{t}\big)^{T}$, which is symmetric for all $t\geq 0$?

PS: from this, my aim is to understand the behaviour of the sum $\sum_{t\geq 0}\mathbf{E}(t)$.


We can vectorize $E(t)$ as

$$ \operatorname{vec}(E(t)) = (A^t \otimes A^t) \operatorname{vec}(S) = (A \otimes A)^t \operatorname{vec}(S) $$

where $\otimes$ is the Kronecker product. If the eigenvalues of $A$ lie strictly inside the unit circle, then the eigenvalues of $A \otimes A$ (the products $\lambda_i \lambda_j$) do too, and the sum converges. Therefore,

$$ \operatorname{vec} \left(\sum_{t\geq0} E(t) \right) = (I - A \otimes A)^{-1} \operatorname{vec}(S) $$
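This closed form can be checked numerically against a truncated version of the series (small made-up $A$ and $S$; the spectral radius of $A$ must be below $1$ for the series to converge):

```python
import numpy as np

# Hypothetical example data: A has spectral radius 0.6 < 1, S is symmetric.
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = A.shape[0]

# Closed form: vec(sum E(t)) = (I - A kron A)^{-1} vec(S).
# vec() stacks columns, which corresponds to order='F' reshapes in NumPy.
vec_S = S.reshape(-1, order='F')
vec_E = np.linalg.solve(np.eye(n * n) - np.kron(A, A), vec_S)
E_sum_closed = vec_E.reshape(n, n, order='F')

# Truncated series: sum_{t=0}^{199} A^t S (A^t)^T
E_sum = np.zeros_like(S)
At = np.eye(n)
for t in range(200):
    E_sum += At @ S @ At.T
    At = At @ A

assert np.allclose(E_sum_closed, E_sum)
```

Note that `vec` here is column-stacking, which matches the identity $\operatorname{vec}(AXB^T) = (B \otimes A)\operatorname{vec}(X)$ used above.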

Of course, you can make the calculations much easier by using decompositions of $A$. In particular, you can use the Schur decomposition to make $A$ upper triangular and easily solve

$$ (I - A \otimes A) \operatorname{vec} \left(\sum_{t\geq0} E(t) \right) = \operatorname{vec}(S) $$

for the elements of the sum one by one.

Another alternative would be to use the SVD of $A$.

Edit: another approach is to solve the following equivalent discrete Lyapunov equation:

$$ E - A E A^T = S $$

where $E := \sum_{t\geq0} E(t)$.
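SciPy has a solver for exactly this equation (internally it uses a Schur-based direct method); a minimal sketch with the same kind of made-up data as above:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical example data: A stable (spectral radius 0.6), S symmetric.
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Solves E - A E A^T = S for E = sum_{t>=0} A^t S (A^t)^T
E = solve_discrete_lyapunov(A, S)

# Verify E satisfies the Lyapunov equation
assert np.allclose(E - A @ E @ A.T, S)
```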