What is wrong with this Jordan normal form computation?


The question I am working on is to compute the Jordan normal form of $$A := \begin{pmatrix} 2 & 1 & 5 \\ 0 & 1 & 3\\ 1 & 0 & 1\end{pmatrix}.$$ The characteristic polynomial and the minimal polynomial of $A$ are both $x^{2}(x - 4)$. Then the Jordan normal form of $A$ is given by $$J := \begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 0\\ 0 & 0 & 4 \end{pmatrix}.$$ Then there exists a matrix $P$ such that $P^{-1}AP = J$.

I am having an issue finding $P$. From the theory of Jordan normal forms, $P = [w_{1}\, w_{2}\, w_{3}]$ where $w_{1}, w_{2}$ form a basis of the nullspace of $(A - 0\cdot I)^{2}$ (where $I$ is the identity matrix) and $w_{3}$ spans the nullspace of $(A - 4\cdot I)$. We first consider $(A - 0 \cdot I)^{2}$. As $$A^{2} = \begin{pmatrix} 9 & 3 & 18\\ 3 & 1 & 6\\ 3& 1 & 6\end{pmatrix} \sim \begin{pmatrix} 1 & 1/3 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$$ (where the $\sim$ denotes row equivalence), the nullspace of $A^{2}$ is spanned by the column vectors $\{(-2, 0, 1)^{t}, (-1, 3, 0)^{t}\}$. Next we consider $(A - 4 \cdot I)$. We have $$A - 4I = \begin{pmatrix} -2 & 1 & 5\\ 0 & -3 & 3\\ 1 & 0 & -3 \end{pmatrix} \sim \begin{pmatrix} 1 & 0 & -3\\0 & 1 & -1\\ 0 & 0 & 0\end{pmatrix}.$$ Then the nullspace is spanned by the column vector $(3, 1, 1)^{t}$.

Therefore we should have $$P = \begin{pmatrix} -2 & -1 & 3\\0 & 3 & 1\\ 1 & 0 & 1 \end{pmatrix}.$$ However, when I compute $P^{-1}AP$, I get $$P^{-1}AP = \begin{pmatrix} -1 & -1 & 0\\ 1 & 1 & 0\\0 & 0 & 4\end{pmatrix} \neq J.$$ Where did I go wrong? Is there something wrong with how I computed $P$?
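For readers who want to check the computation, here is a short sketch reproducing the reported product (the matrices are taken verbatim from the question; numpy is assumed to be available):

```python
import numpy as np

# The matrix A and the P assembled from the question.
A = np.array([[2.0, 1, 5],
              [0, 1, 3],
              [1, 0, 1]])
P = np.array([[-2.0, -1, 3],
              [0, 3, 1],
              [1, 0, 1]])

result = np.linalg.inv(P) @ A @ P
print(result.round(10))
# The product is [[-1,-1,0],[1,1,0],[0,0,4]], confirming the question:
# an arbitrary basis of null(A^2) need not be a Jordan chain.
```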


There are 3 best solutions below


To calculate the basis vectors for $\lambda=0$ you want to find a chain: vectors $\vec{u}_1$ and $\vec{u}_2$ such that $A\vec{u}_1=0$ and $A\vec{u}_2=\vec{u}_1$. Such a solution clearly exists, since zero is an eigenvalue of $A$ and hence $A$ is singular. For example, $\vec{u}_1 = (-2, -6, 2)^{t}$ and $\vec{u}_2 = (2,-6,0)^{t}$. These will put the matrix in the form you desire.
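Such a chain can be verified numerically; the sketch below assumes numpy, and the particular chain vectors are one valid choice among many (any $\vec u_2$ solving $A\vec u_2=\vec u_1$ works):

```python
import numpy as np

A = np.array([[2.0, 1, 5],
              [0, 1, 3],
              [1, 0, 1]])

u1 = np.array([-2.0, -6, 2])   # eigenvector for 0: A u1 = 0
u2 = np.array([2.0, -6, 0])    # generalized eigenvector: A u2 = u1
u3 = np.array([3.0, 1, 1])     # eigenvector for 4

P = np.column_stack([u1, u2, u3])
J = np.linalg.inv(P) @ A @ P
print(J.round(10))  # the Jordan form [[0,1,0],[0,0,0],[0,0,4]]
```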


I'll add a short explanation why the matrix $P$ can be computed in this way.

You want to find a matrix $P$ such that $P^{-1}AP=J$ which is equivalent to $$AP=PJ.$$ Let us denote the columns of the matrix $P$ by $\vec u_1$, $\vec u_2$, $\vec u_3$. (I will work with column vectors here. We want $P$ to be regular, so these vectors have to be linearly independent.) So we have rewritten the matrix $P$ as $P=\begin{pmatrix} \vec u_1 & \vec u_2 & \vec u_3 \end{pmatrix}$.

Now we notice that $$AP= A\begin{pmatrix} \vec u_1 & \vec u_2 & \vec u_3 \end{pmatrix} = \begin{pmatrix} A\vec u_1 & A\vec u_2 & A\vec u_3 \end{pmatrix}.$$ (If this is not clear to you, think a little bit about the definition of the multiplication of matrices.)

What can we say about the matrices $J$ and $PJ$? Since we have two distinct eigenvalues $\lambda_1$ and $\lambda_2$, and the algebraic multiplicity of $\lambda_1$ is two, we have two possibilities $$J= \begin{pmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_1 & 0 \\ 0 & 0 & \lambda_2 \end{pmatrix} \qquad\text{or}\qquad J= \begin{pmatrix} \lambda_1 & 1 & 0 \\ 0 & \lambda_1 & 0 \\ 0 & 0 & \lambda_2 \end{pmatrix} $$ (BTW you did not mention in your post how you know that in this case you get the second possibility.)

In the first case we have $$ PJ= \begin{pmatrix} \vec u_1 & \vec u_2 & \vec u_3 \end{pmatrix} \begin{pmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_1 & 0 \\ 0 & 0 & \lambda_2 \end{pmatrix}= \begin{pmatrix} \lambda_1\vec u_1 & \lambda_1\vec u_2 & \lambda_2\vec u_3 \end{pmatrix}.$$ So if we want $AP=PJ$, i.e., $$\begin{pmatrix} A\vec u_1 & A\vec u_2 & A\vec u_3 \end{pmatrix}= \begin{pmatrix} \lambda_1\vec u_1 & \lambda_1\vec u_2 & \lambda_2\vec u_3 \end{pmatrix},$$ then we want to find linearly independent vectors $\vec u_{1,2,3}$ such that $A\vec u_1=\lambda_1\vec u_1$, $A\vec u_2=\lambda_1\vec u_2$ and $A\vec u_3= \lambda_2 \vec u_3$. This means that $\vec u_{1,2}$ are eigenvectors for $\lambda_1$ and $\vec u_3$ is an eigenvector corresponding to $\lambda_2$.

Based on what you wrote in the OP, it seems that you calculated eigenvectors and you did not find two linearly independent eigenvectors for $\lambda_1=0$. So we must try the second possibility for $J$.
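This can also be seen numerically: the geometric multiplicity of $\lambda_1=0$ is $\dim\operatorname{Ker}(A)=n-\operatorname{rank}(A)=1<2$, so the diagonal possibility is ruled out. A minimal sketch, assuming numpy:

```python
import numpy as np

A = np.array([[2.0, 1, 5],
              [0, 1, 3],
              [1, 0, 1]])

# Geometric multiplicity of the eigenvalue 0 via rank-nullity.
geom_mult = 3 - np.linalg.matrix_rank(A)
print(geom_mult)  # 1, while the algebraic multiplicity of 0 is 2
```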

In this case we have $$ PJ= \begin{pmatrix} \vec u_1 & \vec u_2 & \vec u_3 \end{pmatrix} \begin{pmatrix} \lambda_1 & 1 & 0 \\ 0 & \lambda_1 & 0 \\ 0 & 0 & \lambda_2 \end{pmatrix}= \begin{pmatrix} \lambda_1\vec u_1 & \vec u_1+\lambda_1\vec u_2 & \lambda_2\vec u_3 \end{pmatrix}.$$ Now $AP=PJ$, i.e. $$\begin{pmatrix} A\vec u_1 & A\vec u_2 & A\vec u_3 \end{pmatrix}= \begin{pmatrix} \lambda_1\vec u_1 & \vec u_1+\lambda_1\vec u_2 & \lambda_2\vec u_3 \end{pmatrix},$$ is equivalent to $A\vec u_1=\lambda_1\vec u_1$, $A\vec u_2=\vec u_1+\lambda_1\vec u_2$ and $A\vec u_3= \lambda_2 \vec u_3$.

So if you already have an eigenvector $\vec u_1$, you need to find a generalized eigenvector $\vec u_2$ fulfilling $A\vec u_2=\lambda_1\vec u_2+\vec u_1$ or, equivalently, $$(A-\lambda_1I)\vec u_2=\vec u_1.$$
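In practice the generalized eigenvector is found by solving this singular linear system. A hedged sketch, assuming numpy, using a least-squares solve (the system is consistent because the eigenvector lies in the range of $A-\lambda_1 I$; here $\lambda_1=0$ and the eigenvector is taken from the example matrix):

```python
import numpy as np

A = np.array([[2.0, 1, 5],
              [0, 1, 3],
              [1, 0, 1]])
lam = 0.0
eigvec = np.array([-1.0, -3, 1])   # an eigenvector of A for lambda = 0

# Solve (A - lam*I) g = eigvec; lstsq handles the singular matrix.
g, *_ = np.linalg.lstsq(A - lam * np.eye(3), eigvec, rcond=None)
print((A @ g).round(10))  # equals eigvec, so g is a generalized eigenvector
```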


It is important to mention that the situation would be a little more complicated if we had a characteristic polynomial of the form $(x-\lambda_1)^3$. There would be more possibilities for the Jordan normal form $J$, and also in the case $J=\begin{pmatrix} \lambda_1 & 1 & 0 \\ 0 & \lambda_1 & 0 \\ 0 & 0 & \lambda_1 \end{pmatrix}$ we would have two linearly independent eigenvectors for the eigenvalue $\lambda_1$, but not every eigenvector would have a generalized eigenvector. So if we use this way to compute the matrix $P$, we would have to be careful with our choice of eigenvectors and with the ordering of the obtained vectors in the final matrix.


Let me mention a quick and effective way of determining the Jordan canonical form of a matrix of small size. It is based on the dot-diagram associated with the matrix under consideration.
General Case

Let $A$ have a split characteristic polynomial. (This is no restriction at all if $A\in \text{Mat}_n(\mathbb C)$.) Now fix an eigen-value $\lambda$ and suppose that the generalised eigen-space $K_\lambda$ has a basis consisting of $k$ chains $\beta_1,\ldots,\beta_k$, where $$\beta_i=\{v_i,(T-\lambda I)(v_i),\ldots,(T-\lambda I)^{p_i-1}(v_i)\}\tag{$\times$}$$ for some $v_i$ and $p_i$. Then we construct the diagram $$\begin{matrix}\cdot&\cdot&\ldots&\cdot\\\cdot&\cdot&\cdots&\\\cdot&\cdots&&\\\cdot&&&\end{matrix},$$ where the $i$-th column has $p_i$ dots, there are $k$ columns, and the columns are ordered so that $p_1\ge\cdots\ge p_k$.
Then the number of dots in the first $r$ rows is equal to $\text{dim}(\text{Ker}(T-\lambda I)^r)$. Therefore, writing $r_i$ for the number of dots in the $i$-th row, $\begin{cases}r_1=\text{dim}(\text{Ker}(T-\lambda I))\\r_i=\text{rank}(T-\lambda I)^{i-1}-\text{rank}(T-\lambda I)^i&i\gt 1\end{cases}$.

And the involved ranks can then be easily computed, in the case of matrices of small size.
Our Case

The only trouble occurs with the eigen-value $\lambda=0$. In the above notation, $\begin{cases}r_1=3-\text{rank}(A)=1\\r_2=\text{rank}(A)-\text{rank}(A^2)=2-1=1\end{cases}$. Hence the dot-diagram is of the form $\begin{matrix}\cdot\\\cdot\end{matrix}$, and the Jordan block for $\lambda=0$ is of the form $\begin{pmatrix}0&1\\0&0\end{pmatrix}$.
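The two ranks involved are easy to confirm numerically; a small sketch, assuming numpy:

```python
import numpy as np

A = np.array([[2.0, 1, 5],
              [0, 1, 3],
              [1, 0, 1]])

# Dots in each row of the dot-diagram for the eigenvalue 0.
r1 = 3 - np.linalg.matrix_rank(A)
r2 = np.linalg.matrix_rank(A) - np.linalg.matrix_rank(A @ A)
print(r1, r2)  # 1 1 -> one column of two dots, i.e. a single 2x2 block
```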

After the determination of the form of the Jordan blocks, it remains only to find a basis of the generalised eigen-space consisting of chains of generalised eigen-vectors, i.e. of the above-required form $(\times)$. But this is easy: in our case we need a single chain of length $2$, so it suffices to find a vector $v$ with $A^{2}v=0$ but $Av\neq 0$, that is, any vector of $\text{Ker}(A^{2})$ that does not lie in $\text{Ker}(A)$. For instance $v=(-1,3,0)^{t}$ works, with $Av=(1,3,-1)^{t}$ an eigenvector for $0$. This gives us a desired chain $\{v, Av\}$; when assembling $P$ for the upper-triangular Jordan block, the eigenvector $Av$ goes in the first column and $v$ in the second.
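Putting the pieces together, a chain taken from $\text{Ker}(A^2)\setminus\text{Ker}(A)$ can be checked directly; a sketch assuming numpy, where the particular $v$ is one valid choice:

```python
import numpy as np

A = np.array([[2.0, 1, 5],
              [0, 1, 3],
              [1, 0, 1]])

v = np.array([-1.0, 3, 0])            # A^2 v = 0 but A v != 0
chain = [A @ v, v]                    # eigenvector first, then v
u3 = np.array([3.0, 1, 1])            # eigenvector for 4
P = np.column_stack(chain + [u3])
J = np.linalg.inv(P) @ A @ P
print(J.round(10))  # the Jordan form [[0,1,0],[0,0,0],[0,0,4]]
```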
If anything is inappropriate, just inform me. Thanks in advance.