Permutation and characteristic polynomial of a matrix


I have a matrix $A\in \mathbb{R}^{n\times n}$, and after some manipulations I find: $$\tilde{A}=\begin{pmatrix} 0 & I_n&0&\cdots&0\\0&0&I_n&\cdots&0\\ \vdots&\vdots& \vdots&\ddots &\vdots\\0&0&0&\cdots&I_n\\ A & 0 &0 &\cdots &0 \end{pmatrix};\; \mathcal{A}=\begin{pmatrix} I_n&0&\cdots&0&0\\0&I_n&\cdots&0&0\\ \vdots& \vdots&\ddots &\vdots&\vdots\\0&0&\cdots&I_n&0\\ 0 & 0 &\cdots &0&A \end{pmatrix} ; \tilde{A},\mathcal{A}\in\mathbb{R}^{kn\times kn}$$ By a simple permutation, $\tilde{A}$ becomes $\mathcal{A}$.
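For concreteness, here is a small numerical sketch of this setup in NumPy (the sizes $n=3$, $k=4$ and the random $A$ are arbitrary choices): one such permutation is a cyclic shift of the block columns of $\tilde{A}$, which indeed produces $\mathcal{A}$.

```python
import numpy as np

n, k = 3, 4                       # arbitrary small sizes for the check
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))   # any real n x n matrix

I, Z = np.eye(n), np.zeros((n, n))

# A-tilde: identity blocks on the block superdiagonal, A in the bottom-left block
At = np.block([[I if j == i + 1 else Z for j in range(k)] for i in range(k)])
At[(k - 1) * n:, :n] = A

# script-A: block diagonal with k-1 identity blocks followed by A
Ac = np.eye(k * n)
Ac[(k - 1) * n:, (k - 1) * n:] = A

# moving the first block column of A-tilde to the end gives script-A
perm = np.r_[np.arange(n, k * n), np.arange(n)]
assert np.allclose(At[:, perm], Ac)
```

Note that permuting columns alone is not a similarity transformation, which is why the eigenvalues of $\tilde{A}$ and $\mathcal{A}$ need not coincide.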

Q: My objective is to find whether there exists a relation between the eigenvalues of $A$ and those of $\tilde{A}$ (and hence $\mathcal{A}$), in terms of their characteristic polynomials.

Is there any relation between $P_\mathcal{A}(\lambda)$ and $P_{\tilde{A}}(\lambda)$, where $\mathcal{A}$ is a simple permutation of $\tilde{A}$? Or just a relation between their eigenvalues?

Thanks a lot,


I see no easy way to relate the eigenvalues of $\tilde A$ and $\mathcal A$ directly. However, both of these have eigenvalues that are neatly related to those of $A$.

Characteristic polynomial of $\mathcal A$: Assuming for the moment that $A$ is diagonalizable, note that the vectors of the form $$ v = \pmatrix{x_1\\\vdots \\ x_{k-1} \\ 0}, \quad v = \pmatrix{0\\ \vdots\\ 0 \\ w} $$ where $x_i \in \Bbb C^n$ are arbitrary and $w$ is an eigenvector of $A$ can be used to form an eigenbasis of $\mathcal A$ (the first kind are eigenvectors with eigenvalue $1$; the second carry the eigenvalues of $A$). Conclude that $\mathcal A$ has characteristic polynomial $$ p_{\mathcal A}(\lambda) = (\lambda - 1)^{n(k-1)}p_{A}(\lambda) $$ This carries over to the case where $A$ fails to be diagonalizable by the continuous dependence of the characteristic polynomial on the matrix entries. We could also make an argument appealing to the Jordan form of $A$.
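This factorization is easy to sanity-check numerically (a sketch in NumPy; the sizes and the random $A$ are illustrative choices). `np.poly` returns the coefficients of the monic characteristic polynomial, so both sides can be compared coefficient by coefficient:

```python
import numpy as np

n, k = 3, 4
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))

# script-A: block diagonal with k-1 identity blocks followed by A
Ac = np.eye(k * n)
Ac[(k - 1) * n:, (k - 1) * n:] = A

# monic characteristic polynomial coefficients, descending powers
p_Ac = np.poly(Ac)

# (lambda - 1)^(n(k-1)) * p_A(lambda):
# np.poly of n(k-1) roots equal to 1, multiplied by p_A
expected = np.polymul(np.poly(np.ones(n * (k - 1))), np.poly(A))
assert np.allclose(p_Ac, expected, atol=1e-8)
```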

Characteristic polynomial of $\tilde A$: a vector $v = (x_1,\dots,x_k)$ is an eigenvector of $\tilde A$ with eigenvalue $\lambda$ if and only if we have $$ \pmatrix{x_2\\ \vdots \\ x_k\\ Ax_1} = \lambda\pmatrix{x_1\\ \vdots \\ x_{k-1} \\ x_k} $$ That is, the vectors $x_i$ must satisfy $$ x_i = \lambda x_{i-1} \qquad i=2,\dots,k\\ Ax_1 = \lambda x_k $$ Substituting the first set of equations into the last (so that $x_i = \lambda^{i-1}x_1$), we see that this amounts to $$ Ax_1 = \lambda^k x_1 $$ So, $\lambda$ is an eigenvalue of $\tilde A$ precisely when $\lambda^k$ is an eigenvalue of $A$. Since both sides are monic of degree $nk$, we can use this to conclude that $$ p_{\tilde A}(\lambda) = p_A(\lambda^k) $$
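This identity can also be verified numerically (again a sketch with arbitrary sizes and a random $A$). The coefficients of $p_A(\lambda^k)$ are those of $p_A$ spread out to every $k$-th degree, with zeros in between:

```python
import numpy as np

n, k = 3, 4
rng = np.random.default_rng(2)
A = rng.standard_normal((n, n))

I, Z = np.eye(n), np.zeros((n, n))
At = np.block([[I if j == i + 1 else Z for j in range(k)] for i in range(k)])
At[(k - 1) * n:, :n] = A

p_At = np.poly(At)   # monic characteristic polynomial of A-tilde, degree nk

# p_A(lambda^k): the coefficient of lambda^(n-i) in p_A becomes the
# coefficient of lambda^(k(n-i)); all other coefficients vanish
p_A = np.poly(A)
p_A_lk = np.zeros(n * k + 1)
p_A_lk[::k] = p_A
assert np.allclose(p_At, p_A_lk, atol=1e-8)
```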