$2\times 2$ matrix similar to a matrix of its eigenvalues' real and imaginary parts


Suppose $A$ is a $2\times 2$ matrix with the complex eigenvalues $\lambda = \alpha \pm i \beta$.

I need to show that $A = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}V^{-1}$, where $V$ is an invertible matrix with columns $v_{1}$ and $v_{2}$. Then, I need to show that $v_{1}+iv_{2}$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda = \alpha + i\beta$.

First of all, this is for a differential equations course, and it has been a very long time since I've done any serious linear algebra. I think what this problem is saying is that I need to show that $A$ is similar to a matrix with entries the $\pm$ real and imaginary parts of its eigenvalues, but I'm not sure how to do that.

Secondly, how do I show that $v_{1}+iv_{2}$ is an eigenvector corresponding to the eigenvalue $\alpha + i\beta$ without explicitly knowing what $v_{1}$ and $v_{2}$ are? (Or will I know? I'm very confused.)

I am in a bit over my head with this problem and could really use some guidance. I thank you for your time and patience.

4 Answers

Best Answer

Assume that $\beta\ne 0$, otherwise we’re dealing with a repeated real eigenvalue, which requires a different analysis. A $2\times2$ matrix with eigenvalues $\alpha\pm i\beta$ has characteristic equation $(\lambda-\alpha)^2+\beta^2=0$. Let $B=A-\alpha I$. By the Cayley-Hamilton theorem, $(A-\alpha I)^2+\beta^2I=0$, therefore $B^2=-\beta^2I$. Let $\mathbf v_1$ be any nonzero real vector and define $\mathbf v_2 = -\frac1\beta B\mathbf v_1$. Then $$B\mathbf v_2 = -\frac1\beta B^2\mathbf v_1 = \beta\mathbf v_1$$ and by definition $$B\mathbf v_1=-\beta\mathbf v_2.$$ Let $V = \begin{bmatrix}\mathbf v_1&\mathbf v_2\end{bmatrix}$. Note that $V$ is invertible: if $\mathbf v_2$ were a real multiple of $\mathbf v_1$, then $\mathbf v_1$ would be a real eigenvector of $B$, which is impossible since $B^2=-\beta^2I$ forces the eigenvalues of $B$ to be $\pm i\beta$. Then $$BV\begin{bmatrix}1\\0\end{bmatrix} = B\mathbf v_1 = -\beta\mathbf v_2 = -\beta V\begin{bmatrix}0\\1\end{bmatrix}$$ and $$BV\begin{bmatrix}0\\1\end{bmatrix} = B\mathbf v_2 = \beta\mathbf v_1 = \beta V\begin{bmatrix}1\\0\end{bmatrix}$$ therefore $$BV = V\begin{bmatrix}0&\beta\\ -\beta&0\end{bmatrix}.$$ Finally, $$A = \alpha I+B = \alpha I+V\begin{bmatrix}0&\beta\\-\beta&0\end{bmatrix}V^{-1} = V\begin{bmatrix}\alpha&\beta\\-\beta&\alpha\end{bmatrix}V^{-1}.$$
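As a numerical sanity check (not part of the proof), the construction above can be carried out in plain Python; the matrix `A` and the starting vector `v1` below are arbitrary sample choices.

```python
# Sanity check of the construction B = A - alpha*I, v2 = -(1/beta) B v1,
# on a sample 2x2 matrix with complex eigenvalues (an arbitrary choice).
A = [[1.0, -2.0],
     [5.0, 3.0]]

# Eigenvalues alpha ± i beta from lambda^2 - tr(A) lambda + det(A) = 0.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
alpha = tr / 2
beta = (det - alpha ** 2) ** 0.5       # positive, since the eigenvalues are complex

B = [[A[0][0] - alpha, A[0][1]],
     [A[1][0], A[1][1] - alpha]]

v1 = [1.0, 0.0]                        # any nonzero real vector works
Bv1 = [B[0][0] * v1[0] + B[0][1] * v1[1],
       B[1][0] * v1[0] + B[1][1] * v1[1]]
v2 = [-Bv1[0] / beta, -Bv1[1] / beta]

# V has columns v1, v2; verify A V == V C with C = [[alpha, beta], [-beta, alpha]].
V = [[v1[0], v2[0]],
     [v1[1], v2[1]]]
C = [[alpha, beta], [-beta, alpha]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

lhs = matmul(A, V)
rhs = matmul(V, C)
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-9 for i in range(2) for j in range(2))
print("A V == V C verified for alpha =", alpha, "beta =", beta)
```

For this sample matrix the eigenvalues come out as $2 \pm 3i$, and the check confirms $AV = VC$ without ever computing an eigenvector directly.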

Proving the second part is a matter of multiplying out $A(\mathbf v_1+i\mathbf v_2) = (\alpha I+B)(\mathbf v_1+i\mathbf v_2)$ for this choice of $\mathbf v_1$ and $\mathbf v_2$ and doing a bit of straightforward algebra.
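That algebra can also be checked numerically with Python's built-in complex numbers; the values of `A`, `alpha`, `beta`, `v1`, `v2` below come from the same arbitrary sample matrix used above.

```python
# Check that w = v1 + i v2 satisfies A w = (alpha + i beta) w for a sample
# matrix A with eigenvalues 2 ± 3i (arbitrary illustration, not the proof).
A = [[1.0, -2.0],
     [5.0, 3.0]]
alpha, beta = 2.0, 3.0
v1 = [1.0, 0.0]
v2 = [1.0 / 3.0, -5.0 / 3.0]           # v2 = -(1/beta) (A - alpha I) v1

w = [v1[0] + 1j * v2[0], v1[1] + 1j * v2[1]]
Aw = [A[0][0] * w[0] + A[0][1] * w[1],
      A[1][0] * w[0] + A[1][1] * w[1]]
lam = alpha + 1j * beta
assert all(abs(Aw[k] - lam * w[k]) < 1e-9 for k in range(2))
print("A w == lambda w verified")
```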

Answer

Your result holds only if $(*)$: $\beta\not= 0$. Assume $(*)$ and that $A\in M_2(\mathbb{R})$ with $\operatorname{spectrum}(A)=\{\alpha \pm i\beta\}$. Since $A$ is diagonalizable over $\mathbb{C}$, it is similar over $\mathbb{C}$ to $\operatorname{diag}(\lambda,\overline{\lambda})$, and hence to $B=\begin{pmatrix}\alpha&\beta\\-\beta&\alpha\end{pmatrix}$ (since $\operatorname{spectrum}(B)=\operatorname{spectrum}(A)$). Since $(1)$ two real matrices that are similar over $\mathbb{C}$ are also similar over $\mathbb{R}$, there is an invertible $V\in M_2(\mathbb{R})$ s.t. $A=VBV^{-1}$.

EDIT. The proof of $(1)$ is standard. Let $A,B\in M_n(\mathbb{R})$ and let $P=P_1+iP_2\in M_n(\mathbb{C})$ be invertible, with $P_1,P_2$ real, s.t. $A=PBP^{-1}$. Then $AP=PB$, and comparing real and imaginary parts gives $AP_1=P_1B$ and $AP_2=P_2B$; then, for every $t\in\mathbb{C}$, $A(P_1+tP_2)=(P_1+tP_2)B$. Consider the function $f:t\in \mathbb{C}\rightarrow \det(P_1+tP_2)$; $f$ is a polynomial that is not identically $0$ because $f(i)=\det(P)\not= 0$; then there is $t\in\mathbb{R}$ s.t. $f(t)\not= 0$, and $P_1+tP_2$ is a real similarity, so we are done.
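The polynomial argument can be illustrated in plain Python: take real parts $P_1, P_2$ that are each singular on their own, yet with $P_1+iP_2$ invertible, and scan a few real values of $t$. The matrices below are arbitrary sample choices.

```python
# Illustration of step (1): f(t) = det(P1 + t P2) is a polynomial, and
# f(i) != 0 because P = P1 + i P2 is invertible, so f has a real non-root.
# P1 and P2 are arbitrary sample matrices, each singular on its own.
P1 = [[1.0, 0.0],
      [0.0, 0.0]]
P2 = [[0.0, 0.0],
      [0.0, 1.0]]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def f(t):
    return det2([[P1[i][j] + t * P2[i][j] for j in range(2)]
                 for i in range(2)])

assert abs(f(1j)) > 1e-12              # P itself is invertible
# f has degree at most 2 here, so at most 2 real roots: 3 samples suffice.
t_real = next(t for t in (0.0, 1.0, 2.0) if abs(f(t)) > 1e-12)
print("real t with det(P1 + t P2) != 0:", t_real)
```

Here $f(t) = t$, so $t=0$ fails but $t=1$ already gives an invertible real combination.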

Answer

As loup blanc noticed, your assertions hold only if $\beta \neq 0$. Then you are able to prove $A = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}V^{-1}$ by using the spectrum of $A$.

For your second question, write $V=\begin{pmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{pmatrix}$, so that $v_1=\begin{pmatrix}v_{11}\\v_{21}\end{pmatrix}$ and $v_2=\begin{pmatrix}v_{12}\\v_{22}\end{pmatrix}$, and read $AV = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}$ column by column: $$ Av_1=\begin{pmatrix} \alpha v_{11} - \beta v_{12} \\ \alpha v_{21} - \beta v_{22} \end{pmatrix} $$

$$ Av_2=\begin{pmatrix} \alpha v_{12} + \beta v_{11} \\ \alpha v_{22} + \beta v_{21} \end{pmatrix} $$

Then, $A(v_1 + i v_2) = Av_1 + i\,Av_2 = (\alpha v_1 - \beta v_2) + i(\beta v_1 + \alpha v_2) = (\alpha + i\beta)(v_1+iv_2) = \lambda (v_1+iv_2)$.
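The columnwise identities can be confirmed numerically: build $A = VCV^{-1}$ from an arbitrary invertible $V$ (the sample below has determinant $1$, so its inverse is exact) and check both columns.

```python
# Columnwise check: with A = V C V^{-1}, confirm A v1 = alpha v1 - beta v2
# and A v2 = beta v1 + alpha v2.  V, alpha, beta are arbitrary samples.
alpha, beta = 2.0, 3.0
V = [[1.0, 4.0],
     [2.0, 9.0]]                       # det = 1, so V^{-1} is exact
Vinv = [[9.0, -4.0],
        [-2.0, 1.0]]
C = [[alpha, beta],
     [-beta, alpha]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = matmul(matmul(V, C), Vinv)
v1 = [V[0][0], V[1][0]]
v2 = [V[0][1], V[1][1]]
Av1 = [A[0][0] * v1[0] + A[0][1] * v1[1], A[1][0] * v1[0] + A[1][1] * v1[1]]
Av2 = [A[0][0] * v2[0] + A[0][1] * v2[1], A[1][0] * v2[0] + A[1][1] * v2[1]]
assert all(abs(Av1[k] - (alpha * v1[k] - beta * v2[k])) < 1e-9 for k in range(2))
assert all(abs(Av2[k] - (beta * v1[k] + alpha * v2[k])) < 1e-9 for k in range(2))
print("columns behave as claimed")
```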

Answer

You can use the following identity, valid for square blocks $A$ and $B$ of the same size (in particular scalars): $$ \begin{pmatrix} 1 & i \\ 1 & -i \end{pmatrix} \begin{pmatrix} A & -B \\ B & A \end{pmatrix} \begin{pmatrix} \frac12 & \frac12 \\-\frac i2& \frac i2\end{pmatrix} = \begin{pmatrix} A+Bi &\\ &A-Bi \end{pmatrix} $$ Here the first and last factors are inverse to each other, so this exhibits $\begin{pmatrix} A & -B \\ B & A \end{pmatrix}$ as similar to $\operatorname{diag}(A+Bi,\,A-Bi)$; taking $A=\alpha$ and $B=-\beta$ recovers the matrix in the question.
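The identity is easy to verify with Python's complex numbers in the scalar case; `A` and `B` below are arbitrary sample values.

```python
# Verify the displayed block identity with scalar blocks A and B.
A, B = 2.0, 3.0                        # arbitrary sample scalars
L = [[1, 1j], [1, -1j]]
M = [[A, -B], [B, A]]
R = [[0.5, 0.5], [-0.5j, 0.5j]]        # inverse of L

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

result = matmul(matmul(L, M), R)
expected = [[A + B * 1j, 0], [0, A - B * 1j]]
assert all(abs(result[i][j] - expected[i][j]) < 1e-9
           for i in range(2) for j in range(2))
print("L M R == diag(A+Bi, A-Bi) verified")
```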