When $\dim E_{\lambda}=1$, any $2\times 2$ complex matrix $A$ whose only eigenvalue is $\lambda$ (with algebraic multiplicity $2$) is similar to $\left(\begin{array}{ll} \lambda & 1\\ 0 & \lambda \end{array}\right)$.


$\bbox[5px,border:2px solid gray]{ \text{ Case 3 } }$ If $\dim E_{\lambda}=1$, take a nonzero $v\in E_{\lambda}$; then $\{v\}$ is a basis for $E_{\lambda}$. Extend this to a basis $\mathfrak{B}=\{v,\ w\}$ for $\mathbb{C}^{2}$ by choosing $w\in \mathbb{C}^{2}\backslash E_{\lambda}$. If $Aw = \alpha v+\beta w$, then the change-of-basis matrix $P=\begin{bmatrix} v & w \end{bmatrix}$ (whose columns are the vectors of $\mathfrak{B}$) satisfies $C =P^{-1}AP=\left(\begin{array}{ll} \lambda & \alpha\\ 0 & \beta \end{array}\right) \quad (♦) $

But in Case 3, $C$ must also satisfy $\dim E_{\lambda}=1$. For $C$, and hence $A$, to have $\lambda$ with algebraic multiplicity $2$, we need $\beta=\lambda$, so $ C =\left(\begin{array}{ll} \lambda & \alpha\\ 0 & \lambda \end{array}\right) $. (I omit a paragraph deemed unnecessary)

Moreover, writing $u=\alpha v$, we have $ Cw=u+\lambda w$. If we consider the alternative basis $\mathfrak{B}' =\{u,\ w\}$, then there is a change-of-basis matrix, say $Q$, such that $Q^{-1}CQ =\left(\begin{array}{ll} \lambda & 1\\ 0 & \lambda \end{array}\right). $ Finally, use $(♦)$ to express this in terms of $A$: $Q^{-1}CQ = (PQ)^{-1}A(PQ)$.

$5.$ Christiaan Hattingh's answer below says: because "$w$ is not an eigenvector of $C$, ... $Cw=u+\lambda w$". Why not $Cw=ku+\lambda w$ for some scalar $k$?

Then how does he prove that $u = \alpha v$? I don't follow the step "noticing $Aw = \dots$".

$6.$ Why is $Q = \begin{bmatrix} \alpha & 0 \\ 0 & 1 \end{bmatrix}$ ? (See his answer to 6)

$9.$ What's the proof strategy? I'm flummoxed by all this algebra.

On BEST ANSWER

5.$\;$Again here: since $w$ is not an eigenvector of $C$, we cannot have $Cw=\lambda w$, so there must be some vector $u$ such that $Cw=u+\lambda w$. In fact we can do better: noticing that $Aw=1\cdot(\alpha v)+\lambda w$ and $A(\alpha v)=\lambda \cdot(\alpha v)$, we see that in fact $u=\alpha v$.

6.$\;$Again here, just take $Q=\begin{bmatrix} \alpha & 0 \\ 0 & 1 \end{bmatrix}$, and then calculate $Q^{-1}CQ$ to get the required matrix. The verification that this is indeed so is the same as for point 2 above.
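As a quick numerical sanity check of this conjugation (with the arbitrary sample values $\lambda=2$ and $\alpha=3$, chosen purely for illustration):

```python
import numpy as np

# Sample values: lam = 2, alpha = 3 (any lam, and any alpha != 0, work).
lam, alpha = 2.0, 3.0
C = np.array([[lam, alpha],
              [0.0, lam]])
Q = np.array([[alpha, 0.0],
              [0.0,   1.0]])

# Conjugating C by Q rescales the off-diagonal entry from alpha to 1.
J = np.linalg.inv(Q) @ C @ Q
print(J)  # [[2., 1.], [0., 2.]]
```

The diagonal is untouched because $Q$ is diagonal; only the corner entry $\alpha$ is divided by $\alpha$.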

9.$\;$It's a constructive proof, showing how to find a Jordan canonical form for a matrix in $\mathscr{M}_{2 \times 2}(\mathbb{C})$ that cannot be diagonalized.


I respond separately, to try and clarify as per OP's request in comments:

ok, now to further expound 5. If $Cw \neq \lambda w$, then there must be some other vector which is not a scalar multiple of $w$ (i.e. is linearly independent of $w$) so that $Cw$ equals $\lambda w$ plus this other ("unknown") vector. Let us denote this vector as $u$. So then $Cw=u+\lambda w$. We cannot have a third linearly independent vector in this equation, since the dimension of $\mathbb{C}^2$ is 2.

I am not going to add any more to this answer, but here is my suggestion to you - why not try this out in "practice". Take the following matrix: \begin{equation} \begin{bmatrix} \frac{3}{2} & \frac{1}{4} \\ -1 & \frac{5}{2} \end{bmatrix}, \end{equation} find its eigenvalue and a corresponding eigenvector. Then follow the proof step by step (so find the matrix $P$, calculate $P^{-1}AP$, etc.) and see if you can get to the form \begin{equation} \begin{bmatrix} \lambda & 1 \\ 0 & \lambda \end{bmatrix}. \end{equation}
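For anyone who wants to check this exercise numerically, here is a sketch in Python/NumPy. The eigenvector $v=(1,2)$ and the completion $w=(0,1)$ below are choices, not forced; any $w$ outside $\operatorname{span}\{v\}$ works.

```python
import numpy as np

# The suggested exercise: char. polynomial is (lam - 2)^2, so lam = 2,
# and (A - 2I)v = 0 gives the eigenvector v = (1, 2).
A = np.array([[1.5, 0.25],
              [-1.0, 2.5]])
lam = 2.0
v = np.array([1.0, 2.0])
assert np.allclose(A @ v, lam * v)

w = np.array([0.0, 1.0])          # any vector outside span{v}
P = np.column_stack([v, w])       # change of basis to {v, w}
C = np.linalg.inv(P) @ A @ P      # [[2, alpha], [0, 2]]
alpha = C[0, 1]                   # here alpha = 0.25

Q = np.diag([alpha, 1.0])         # rescale: second basis {alpha*v, w}
J = np.linalg.inv(Q) @ C @ Q
print(J)                          # [[2., 1.], [0., 2.]]
```

The two conjugations compose: $J=(PQ)^{-1}A(PQ)$, exactly the $(♦)$ substitution in the question.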


Any square (complex) matrix $A$ has at least one eigenvalue $\lambda_1$, with an eigenvector $v_1$. We can complete the eigenvector to a basis $\mathscr{B}=\{v_1,v_2,\dots,v_n\}$ of $\mathbb{C}^n$ ($n$ is the order of $A$). Then the linear map $f\colon \mathbb{C}^n\to\mathbb{C}^n$ defined by $f(v)=Av$ has, as representing matrix with respect to $\mathscr{B}$, the matrix $$ C=\begin{bmatrix} C_{\mathscr{B}}(Av_1) & C_{\mathscr{B}}(Av_2) & \dots & C_{\mathscr{B}}(Av_n) \end{bmatrix} $$ where $C_{\mathscr{B}}(v)$ denotes the coordinate vector with respect to $\mathscr{B}$, that is $$ C_{\mathscr{B}}(v)= \begin{bmatrix} \alpha_1\\ \alpha_2\\ \vdots\\ \alpha_n \end{bmatrix} \quad\text{if and only if}\quad v=\alpha_1v_1+\alpha_2v_2+\dots+\alpha_nv_n. $$ Now, $Av_1=\lambda_1v_1$, so the first column of $C$ is $$ \begin{bmatrix} \lambda_1\\ 0\\ \vdots\\ 0 \end{bmatrix}. $$ Note that the matrix $C$ is similar to $A$, because $A=PCP^{-1}$, where $P=\begin{bmatrix}v_1&v_2&\dots&v_n\end{bmatrix}$. In this way we can prove, by induction, that every matrix is similar to a triangular matrix.
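The construction can be sketched numerically. The matrix $A$ and the eigenpair below are arbitrary sample choices (eigenvalue $4$ with eigenvector $v_1=(2,3)$); completing $v_1$ with $e_2=(0,1)$ gives a basis because $v_1$ is not a multiple of $e_2$.

```python
import numpy as np

# Sample matrix with eigenvalue 4 and eigenvector v1 = (2, 3).
A = np.array([[1.0, 2.0],
              [3.0, 2.0]])
lam1 = 4.0
v1 = np.array([2.0, 3.0])
assert np.allclose(A @ v1, lam1 * v1)

# Complete v1 to a basis {v1, e2} and change coordinates.
P = np.column_stack([v1, [0.0, 1.0]])
C = np.linalg.inv(P) @ A @ P
print(C)  # first column is (4, 0), so C is upper triangular
```

As the answer says, the first column of $C$ is forced to be $(\lambda_1,0)^{\mathsf T}$ because $Av_1=\lambda_1 v_1$; the rest of $C$ depends on the completion chosen.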

In the particular case when $n=2$, the matrix $C$ is already triangular: $$ C=\begin{bmatrix} \lambda_1 & t\\ 0 & \lambda_2 \end{bmatrix}\,. $$ Note that, since $A$ and $C$ are similar, the diagonal elements of $C$ are exactly the eigenvalues of $A$: similar matrices have the same eigenvalues and the eigenvalues of a triangular matrix are the coefficients on the diagonal.

Case 1: $\lambda_2\ne\lambda_1$

We can choose $v_2$ to be an eigenvector relative to $\lambda_2$, and in this case we have $t=0$.

Case 2: $\lambda_2=\lambda_1$ and the dimension of the eigenspace is $2$

We can choose $v_2$ to be an eigenvector relative to $\lambda_1$; again $t=0$.

Case 3: $\lambda_2=\lambda_1$ and the dimension of the eigenspace is $1$

There is no way to make $C$ diagonal: $t=0$ would mean that $v_2$ is an eigenvector relative to $\lambda_2=\lambda_1$, contradicting the assumption that the eigenspace has dimension $1$.

If $x$ is such that $\{v_1,x\}$ is linearly independent, then $Ax=\alpha v_1+\lambda_1x$, with $\alpha\ne0$. Note that the coefficient of $x$ is forced to be $\lambda_1$, because in the basis $\{v_1,x\}$ the representing matrix is triangular and both diagonal entries must equal $\lambda_1$ (algebraic multiplicity $2$); and $\alpha\ne0$, because otherwise $x$ would be an eigenvector, against the geometric multiplicity being $1$.

If $t\ne0$, setting $v_2=\alpha^{-1}tx$, we have that $$ Av_2=\alpha^{-1}t(Ax)=\alpha^{-1}t\alpha v_1+\alpha^{-1}t\lambda_1 x= tv_1+\lambda_1v_2, $$ which means that we can choose $v_2$ so that the representing matrix with respect to $\mathscr{B}=\{v_1,v_2\}$ is $$ \begin{bmatrix} \lambda_1 & t\\ 0 & \lambda_1 \end{bmatrix} $$ with an arbitrary $t\ne0$. Choosing $t=1$ is surely very reasonable, but not at all required.
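A minimal numerical sketch of this rescaling, using a sample matrix with eigenvalue $2$ of geometric multiplicity $1$, the completion $x=(1,0)$, and the arbitrary target $t=5$ (the point being that any nonzero $t$ can be placed in the corner):

```python
import numpy as np

# Sample matrix: char. polynomial (lam - 2)^2, eigenvector v1 = (1, -1),
# geometric multiplicity 1.
A = np.array([[3.0, 1.0],
              [-1.0, 1.0]])
lam1 = 2.0
v1 = np.array([1.0, -1.0])
x = np.array([1.0, 0.0])               # {v1, x} is linearly independent

# Ax = alpha*v1 + lam1*x, so alpha*v1 = Ax - lam1*x; read alpha off
# the first component (here alpha = 1).
alpha = (A @ x - lam1 * x)[0] / v1[0]

t = 5.0
v2 = (t / alpha) * x                   # the rescaled second basis vector
B = np.column_stack([v1, v2])
print(np.linalg.inv(B) @ A @ B)        # [[2., 5.], [0., 2.]]
```

Replacing $t=5$ by $t=1$ reproduces the standard Jordan block, matching the remark that $t=1$ is a convention, not a necessity.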

You see that the consideration of generalized eigenvectors can be avoided in the very simple case $n=2$. However, it becomes really necessary for higher-order matrices, where it leads to the concept of the Jordan decomposition.