Jordan normal form and invertible matrix of generalized eigenvectors proof


Struggling to find a place to start with this proof; I've just begun learning about Jordan normal form.

Given a 2-by-2 matrix $A$ and a Jordan normal form matrix $J_{\lambda}$, there exists a matrix $S = [v_{1}, v_{2}]$ s.t. $$S^{-1}AS = J_{\lambda}$$ That is, $J_{\lambda}$ is similar to $A$.

WTS: $$(A - \lambda I)v_{1} = 0 $$ $$(A - \lambda I)v_{2} = v_{1}$$


I was going to try to start with the similarity condition and work toward producing the two statements above.

Rearrange the similarity condition to $A = SJ_{\lambda}S^{-1}$ and apply both sides to $v_{1}$: $$Av_{1} = SJ_{\lambda}S^{-1}v_{1}$$

Rewrite our first statement to prove as: $$Av_{1} = \lambda I v_{1}$$

Substitute: $$\lambda I v_{1} = SJ_{\lambda}S^{-1}v_{1}$$

But I keep getting stuck. I've tried a few approaches other than this one. What intuition am I missing?

Best answer:

$\newcommand{\x}{\mathbf{x}}$ $\newcommand{\y}{\mathbf{y}}$

First case: The $2\times 2$ matrix $A$ is diagonalizable over the field. That means that there exist $2$ linearly independent right hand eigenvectors $\x,\y$ such that $$A\x = \lambda_1 \x \text{ and } A\y=\lambda_2 \y \tag 1$$ (notice that $\lambda_1$ and $\lambda_2$ need not be distinct).

Now, consider the matrix $S$ which contains the right hand eigenvectors $\x, \y$ as its columns, i.e. $S = \begin{bmatrix} \x & \mid & \y\end{bmatrix}=\begin{bmatrix} x_1 & y_1 \\ x_2 & y_2\end{bmatrix}$. Taking advantage of $(1)$, it holds that $$A \cdot S = S\cdot J,\tag 2$$ where $J = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}$. Since $\x$ and $\y$ are linearly independent, $S$ is invertible, so $(2)$ gives $S^{-1} \cdot A \cdot S = J$.
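As a quick numerical sanity check of the diagonalizable case, here is a sketch in Python with NumPy; the matrix $A$ and its eigenvectors are an illustrative choice, not taken from the question:

```python
import numpy as np

# Illustrative symmetric 2x2 matrix with eigenvalues 1 and 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvectors: (A - 1*I)x = 0 gives x = (1, -1); (A - 3*I)y = 0 gives y = (1, 1)
x = np.array([1.0, -1.0])
y = np.array([1.0, 1.0])

# S has the eigenvectors as columns; J is the diagonal matrix of eigenvalues
S = np.column_stack([x, y])
J = np.diag([1.0, 3.0])

# Verify (2): A.S = S.J, and hence S^{-1}.A.S = J
print(np.allclose(A @ S, S @ J))                 # True
print(np.allclose(np.linalg.inv(S) @ A @ S, J))  # True
```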


$\newcommand{\v}{\mathbf{v}}$ $\newcommand{\w}{\mathbf{w}}$

Second case: The $2\times 2 $ matrix $A$ is not diagonalizable. That means that the Jordan normal form of the matrix will be of the form $J = \begin{bmatrix} \lambda & 1\\ 0 & \lambda\end{bmatrix}$.

Since $A$ is not diagonalizable, there is only one eigenvalue $\lambda$, whose geometric multiplicity is less than its algebraic multiplicity. Thus, there is a right hand eigenvector $\v$ and a generalized right hand eigenvector $\w$ such that: $$\begin{array}[t]{l}A\v = \lambda \v\\ (A-\lambda I) \w =\v. \end{array}$$ Consider the matrix $S = \begin{bmatrix} \v & \mid & \w\end{bmatrix} = \begin{bmatrix} v_1 & w_1 \\ v_2 & w_2 \end{bmatrix}$. Again, taking advantage of the above equalities, it holds: $$A \cdot S = S \cdot J.\tag{3}$$


  • Proof of $(3)$.

$A\cdot S =A\cdot\begin{bmatrix} v_1 & w_1 \\ v_2 & w_2 \end{bmatrix} =\begin{bmatrix} A\v &\mid& A\w \end{bmatrix} = \begin{bmatrix} \lambda \v &\mid& \lambda \w + \v \end{bmatrix} = \begin{bmatrix} \lambda v_1 & \lambda w_1 + v_1 \\ \lambda v_2 &\lambda w_2 + v_2 \end{bmatrix} = S\cdot J$
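The defective case can be checked numerically as well. The matrix below is a hypothetical example, not from the question: it has a single eigenvalue $\lambda = 3$ of algebraic multiplicity $2$ but geometric multiplicity $1$:

```python
import numpy as np

# Illustrative defective matrix: eigenvalue 3 (twice), but A - 3I has rank 1
A = np.array([[4.0, 1.0],
              [-1.0, 2.0]])
lam = 3.0
I = np.eye(2)

# Eigenvector v solves (A - 3I)v = 0
v = np.array([1.0, -1.0])
# Generalized eigenvector w solves (A - 3I)w = v
w = np.array([1.0, 0.0])

# Check the two defining equations from the question
print(np.allclose((A - lam * I) @ v, 0))  # True
print(np.allclose((A - lam * I) @ w, v))  # True

# S = [v | w] and J the 2x2 Jordan block; verify (3) and S^{-1}.A.S = J
S = np.column_stack([v, w])
J = np.array([[lam, 1.0],
              [0.0, lam]])
print(np.allclose(A @ S, S @ J))                 # True
print(np.allclose(np.linalg.inv(S) @ A @ S, J))  # True
```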