How to use Exercise 25 or Exercise 26 to prove $\operatorname{Re}\mathbf{x}$ and $\operatorname{Im}\mathbf{x}$ are linearly independent?


I am reading "Linear Algebra and its Applications 5th Edition" by David C. Lay, Stephen R. Lay and Judi J. McDonald.

I proved Theorem 9.
My questions are the following two questions:

  1. How to use the results of Exercise 25 or Exercise 26 to prove that if $\mathbf{x}$ is an eigenvector for a complex eigenvalue, then $\operatorname{Re}\mathbf{x}$ and $\operatorname{Im}\mathbf{x}$ are linearly independent in $\mathbb{R}^2$?
  2. My proof does not use the result of Exercise 25 at all. Where should the result of Exercise 25 be used?

My proof of Theorem 9:
Let $A$ be a real $2\times 2$ matrix with a complex eigenvalue $\lambda=a-bi$ ($b\neq 0$) and an associated eigenvector $\mathbf{v}$ in $\mathbb{C}^2$.
Let $\mathbf{v}=\begin{pmatrix}e+fi\\g+hi\end{pmatrix}$.
Since $A$ is a real matrix and $\lambda\in\mathbb{C}\setminus\mathbb{R}$, we must have $\begin{pmatrix}f\\h\end{pmatrix}\neq \begin{pmatrix}0\\0\end{pmatrix}$: otherwise $\mathbf{v}$ would be a real nonzero vector, and $\lambda\mathbf{v}=A\mathbf{v}\in\mathbb{R}^2$ would force $\lambda\in\mathbb{R}$.
Assume that $\begin{pmatrix}e\\g\end{pmatrix}$ and $\begin{pmatrix}f\\h\end{pmatrix}$ are linearly dependent.
Then there exists $\begin{pmatrix}\alpha\\\beta\end{pmatrix}\neq\begin{pmatrix}0\\0\end{pmatrix}$ such that $\alpha\begin{pmatrix}e\\g\end{pmatrix}+\beta\begin{pmatrix}f\\h\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}$.
Since $\begin{pmatrix}f\\h\end{pmatrix}\neq \begin{pmatrix}0\\0\end{pmatrix}$, $\alpha\neq 0$.
So, $\begin{pmatrix}e\\g\end{pmatrix}=\gamma\begin{pmatrix}f\\h\end{pmatrix}$, where $\gamma=-\beta/\alpha$ is a real number.
So, $\mathbf{v}=(\gamma+i)\begin{pmatrix}f\\h\end{pmatrix}$.
So, $(\gamma+i)A\begin{pmatrix}f\\h\end{pmatrix}=A\mathbf{v}=\lambda\mathbf{v}=\lambda(\gamma+i)\begin{pmatrix}f\\h\end{pmatrix}$.
So, $A\begin{pmatrix}f\\h\end{pmatrix}\in\mathbb{R}^2$, while $A\begin{pmatrix}f\\h\end{pmatrix}=\lambda\begin{pmatrix}f\\h\end{pmatrix}\notin\mathbb{R}^2$, since $\begin{pmatrix}f\\h\end{pmatrix}\neq\begin{pmatrix}0\\0\end{pmatrix}$ and $\lambda\notin\mathbb{R}$.
This is a contradiction.
So, $\begin{pmatrix}e\\g\end{pmatrix}$ and $\begin{pmatrix}f\\h\end{pmatrix}$ are linearly independent.
So, $P=[\operatorname{Re}\mathbf{v}\;\;\operatorname{Im}\mathbf{v}]=\begin{pmatrix}e&f\\g&h\end{pmatrix}$ is invertible.
By part (a) of Exercise 26, $AP=[A(\operatorname{Re}\mathbf{v})\;\;A(\operatorname{Im}\mathbf{v})]=[\operatorname{Re}\mathbf{v}\;\;\operatorname{Im}\mathbf{v}]\begin{pmatrix}a&-b\\b&a\end{pmatrix}=PC$.
So, $A=PCP^{-1}$.

Theorem 9
Let $A$ be a real $2\times 2$ matrix with a complex eigenvalue $\lambda=a-bi$ ($b\neq 0$) and an associated eigenvector $\mathbf{v}$ in $\mathbb{C}^2$. Then $$A=PCP^{-1},$$ where $P=[\operatorname{Re}\mathbf{v},\operatorname{Im}\mathbf{v}]$ and $C=\begin{pmatrix}a & -b\\ b & a\end{pmatrix}.$

The proof uses the fact that if the entries in $A$ are real, then $A(\operatorname{Re}\mathbf{x})=\operatorname{Re}(A\mathbf{x})$ and $A(\operatorname{Im}\mathbf{x})=\operatorname{Im}(A\mathbf{x})$, and if $\mathbf{x}$ is an eigenvector for a complex eigenvalue, then $\operatorname{Re}\mathbf{x}$ and $\operatorname{Im}\mathbf{x}$ are linearly independent in $\mathbb{R}^2$. (See Exercises 25 and 26.) The details are omitted.

Exercise 25
Let $A$ be a real $n\times n$ matrix, and let $\mathbf{x}$ be a vector in $\mathbb{C}^n$. Show that $\operatorname{Re}(A\mathbf{x})=A\operatorname{Re}(\mathbf{x})$ and $\operatorname{Im}(A\mathbf{x})=A\operatorname{Im}(\mathbf{x})$.
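The identity in Exercise 25 says that a real matrix acts on the real and imaginary parts of a complex vector separately. A quick numerical sketch (using an arbitrary random matrix, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))                            # real n x n matrix
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)   # vector in C^n

# Re(Ax) = A Re(x) and Im(Ax) = A Im(x) when A is real.
assert np.allclose((A @ x).real, A @ x.real)
assert np.allclose((A @ x).imag, A @ x.imag)
```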

Exercise 26
Let $A$ be a real $2\times 2$ matrix with a complex eigenvalue $\lambda=a-bi$ ($b\neq 0$) and an associated eigenvector $\mathbf{v}$ in $\mathbb{C}^2$.
a. Show that $A(\operatorname{Re}\mathbf{v})=a\operatorname{Re}\mathbf{v}+b\operatorname{Im}\mathbf{v}$ and $A(\operatorname{Im}\mathbf{v})=-b\operatorname{Re}\mathbf{v}+a\operatorname{Im}\mathbf{v}$. [Hint: Write $\mathbf{v}=\operatorname{Re}\mathbf{v}+i\operatorname{Im}\mathbf{v}$, and compute $A\mathbf{v}$.]
b. Verify that if $P$ and $C$ are given as in Theorem 9, then $AP=PC$.