Canonical form of a 4x4 matrix with 1 real eigenvalue and 2 linearly independent eigenvectors

Suppose $A$ is a $4 \times 4$ matrix with a single real eigenvalue and two linearly independent eigenvectors. Describe the possible canonical forms for $A$ and show that $A$ may indeed be transformed into one of these canonical forms. Describe explicitly the conditions under which $A$ is transformed into a particular form.

Attempt: A Jordan block is either of the form $\lambda I$ or $\lambda I$ with $1$'s on the superdiagonal (just above the main diagonal). Since $A$ is $4 \times 4$ with only two independent eigenvectors, we can expect its Jordan form $J$ to be either two $2 \times 2$ blocks, or a $3 \times 3$ block together with a $1 \times 1$ block....

And then I am kind of lost... I am not sure how having two linearly independent eigenvectors would help/change the situation, and I don't know how to answer the subsequent questions about the conditions on $A$.


Let $A$ be a $4 \times 4$ matrix with a single real eigenvalue $\lambda$ and two linearly independent eigenvectors.

Then $\dim \operatorname{Ker}(A-\lambda I) = 2$ and, by rank–nullity, $\dim R(A-\lambda I) = 2$. Moreover, since $\lambda$ is the only eigenvalue, $A-\lambda I$ is nilpotent.

First suppose $(A-\lambda I)^{2} = 0$. Then $R(A-\lambda I) \subseteq \operatorname{Ker}(A-\lambda I)$, and since both subspaces have dimension $2$, they are equal. Let $V_{1}$ be an eigenvector; then $V_{1} \in R(A-\lambda I)$, so we may choose $V_{2}$ such that $(A-\lambda I)V_{2} = V_{1}$,

i.e. $AV_{2} = \lambda V_{2} + V_{1}$.

Let $V_{3}$ be a second eigenvector, independent of $V_{1}$; again $V_{3} \in R(A-\lambda I)$, so choose $V_{4}$ such that $(A-\lambda I)V_{4} = V_{3}$,

i.e. $AV_{4} = \lambda V_{4} + V_{3}$.

Now we get $A \begin{pmatrix} V_{1} & V_{2} & V_{3} & V_{4} \end{pmatrix} = \begin{pmatrix} \lambda V_{1} & \lambda V_{2}+V_{1} & \lambda V_{3} & \lambda V_{4}+V_{3} \end{pmatrix}$

$= \begin{pmatrix} V_{1} & V_{2} & V_{3} & V_{4} \end{pmatrix} \begin{pmatrix} \lambda & 1 & 0 & 0 \\ 0 & \lambda & 0 & 0 \\ 0 & 0 & \lambda & 1 \\ 0 & 0 & 0 & \lambda \end{pmatrix}$ (here the $V_{i}$ denote the columns of the matrix),

so $T^{-1}AT = \begin{pmatrix} \lambda & 1 & 0 & 0 \\ 0 & \lambda & 0 & 0 \\ 0 & 0 & \lambda & 1 \\ 0 & 0 & 0 & \lambda \end{pmatrix}$, where $T = \begin{pmatrix} V_{1} & V_{2} & V_{3} & V_{4} \end{pmatrix}$ is invertible because the vectors $V_{i}$ are linearly independent.
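As a numerical sanity check (not part of the argument above, and assuming NumPy is available): build $J$ with two $2 \times 2$ blocks for $\lambda = 2$, conjugate by a random invertible $T$, and confirm that $A - \lambda I$ has rank $2$ (two independent eigenvectors) and squares to zero.

```python
import numpy as np

lam = 2.0
# J: two 2x2 Jordan blocks for the single eigenvalue lam
J = np.array([[lam, 1.0, 0.0, 0.0],
              [0.0, lam, 0.0, 0.0],
              [0.0, 0.0, lam, 1.0],
              [0.0, 0.0, 0.0, lam]])

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 4))   # generically invertible
A = T @ J @ np.linalg.inv(T)      # A is similar to J

N = A - lam * np.eye(4)
print(np.linalg.matrix_rank(N))   # expected: 2, so dim Ker(A - lam I) = 2
print(np.allclose(N @ N, 0))      # expected: True, i.e. (A - lam I)^2 = 0
```

Since similarity preserves the rank and nilpotency degree of $A - \lambda I$, the same two checks hold for any matrix of this canonical type.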

Now suppose instead that $(A-\lambda I)^{2} \neq 0$, and let $V_{1}$ and $V_{2}$ be the two independent eigenvectors, with $V_{2}$ chosen in $R((A-\lambda I)^{2})$.

Choose $V_{3}$ such that $(A-\lambda I)V_{3} = V_{2}$ (such a $V_{3}$ exists when $V_{2}$ is chosen in $R(A-\lambda I)$),

and choose $V_{4}$ such that $(A-\lambda I)V_{4} = V_{3}$. We get, as before, that $T^{-1}AT = \begin{pmatrix} \lambda & 0 & 0 & 0 \\ 0 & \lambda & 1 & 0 \\ 0 & 0 & \lambda & 1 \\ 0 & 0 & 0 & \lambda \end{pmatrix}$, a $1 \times 1$ block together with a $3 \times 3$ block.

In summary, $A$ is similar to the first form (two $2 \times 2$ blocks) exactly when $(A-\lambda I)^{2} = 0$, and to the second form (a $1 \times 1$ and a $3 \times 3$ block) exactly when $(A-\lambda I)^{2} \neq 0$.
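The condition separating the two forms can be seen concretely by comparing the two canonical matrices themselves (again a numerical sketch with NumPy, not part of the answer): both satisfy $(A-\lambda I)^{3} = 0$, but only the $3 \times 3$-block form has $(A-\lambda I)^{2} \neq 0$.

```python
import numpy as np

lam = 2.0
I = np.eye(4)
# First form: two 2x2 Jordan blocks
J1 = np.array([[lam, 1.0, 0.0, 0.0],
               [0.0, lam, 0.0, 0.0],
               [0.0, 0.0, lam, 1.0],
               [0.0, 0.0, 0.0, lam]])
# Second form: a 1x1 block and a 3x3 block
J2 = np.array([[lam, 0.0, 0.0, 0.0],
               [0.0, lam, 1.0, 0.0],
               [0.0, 0.0, lam, 1.0],
               [0.0, 0.0, 0.0, lam]])

for J in (J1, J2):
    N = J - lam * I
    print(np.allclose(N @ N, 0), np.allclose(N @ N @ N, 0))
# J1: True True   -- (A - lam I)^2 = 0 already
# J2: False True  -- (A - lam I)^2 != 0, but the cube vanishes
```

So computing $(A-\lambda I)^{2}$ for a given $A$ tells you immediately which canonical form applies.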