How do I calculate the rank and determinant of this?


I have a $3\times 3$ matrix $A$ with eigenvalues $\lambda_1$, $\lambda_2$, $\lambda_3$ and corresponding eigenvectors $\alpha_1$, $\alpha_2$, $\alpha_3$. Let $\beta=\alpha_1+\alpha_2+\alpha_3$. The question asks first to prove that $\beta$, $A\beta$, $A^2\beta$ are linearly independent, and then to use this result to compute $rank(A-E)$ and $det(A+2E)$ under the assumption $A^3\beta=A\beta$.

I can prove that $\beta$, $A\beta$, $A^2\beta$ are linearly independent, but I have made no progress on the second part. I have checked the solution to this problem; here is its answer:

From $A^3\beta=A\beta$ we have: $$A[\beta,A\beta,A^2\beta]=[A\beta,A^2\beta,A^3\beta]=[A\beta, A^2\beta,A\beta]=[\beta,A\beta,A^2\beta]\left(\begin{array}{ccc}0&0&0\\1&0&1\\0&1&0\end{array}\right)$$ Let $P=[\beta, A\beta, A^2\beta]$; it is invertible, since its columns are linearly independent, and thus: $$P^{-1}AP=\left(\begin{array}{ccc}0&0&0\\1&0&1\\0&1&0\end{array}\right)=B$$ So we have $rank(A-E)=rank(B-E)=2$, $det(A+2E)=det(B+2E)=6$.
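As a numeric sanity check on this answer, here is a small experiment. The concrete matrix is my own construction, not from the book: I take $A$ diagonal with distinct eigenvalues $0$, $1$, $-1$ (so the standard basis vectors are eigenvectors and $\beta$ is their sum, and $A^3\beta=A\beta$ holds):

```python
import numpy as np

# My own concrete instance: A diagonal with distinct eigenvalues 0, 1, -1;
# the standard basis vectors are eigenvectors, and beta is their sum.
A = np.diag([0.0, 1.0, -1.0])
beta = np.ones(3)

# The hypothesis A^3 beta = A beta holds for this choice
assert np.allclose(A @ A @ A @ beta, A @ beta)

# P = [beta, A beta, A^2 beta] as columns, then B = P^{-1} A P
P = np.column_stack([beta, A @ beta, A @ A @ beta])
B = np.linalg.inv(P) @ A @ P

E = np.eye(3)
print(np.round(B))                      # the matrix B from the solution
print(np.linalg.matrix_rank(A - E))     # 2
print(round(np.linalg.det(A + 2 * E)))  # 6
```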

My question is: how do they get the matrix $B$, and is there a better way to get these answers?

EDIT: $\lambda_1$, $\lambda_2$ and $\lambda_3$ are different eigenvalues.


Accepted answer:

This is just a change of basis in disguise: see what $A$ does to the basis vectors $b_1=\beta$, $b_2=A\beta$, $b_3=A^2\beta$: \begin{align} Ab_1 = A\beta &= 0b_1 + 1b_2 + 0b_3, \\ Ab_2 = A^2\beta &= 0b_1 + 0b_2 + 1b_3, \\ Ab_3 = A^3\beta = A\beta &= 0b_1 + 1b_2 + 0b_3. \end{align} Hence, the map $f\colon \Bbb R^3\to\Bbb R^3, x\mapsto Ax$ has matrix $B$ with respect to the basis $(\beta, A\beta, A^2\beta)$; in other words, $B=S^{-1}AS$ for $S=[b_1,b_2,b_3]$.

Now rank and determinant are invariant under change of basis and we note that $$ S^{-1}(A-\lambda E)S = S^{-1} A S - \lambda S^{-1} E S = B - \lambda E, $$ so the same change of basis translates $A-\lambda E$ to $B-\lambda E$, which allows you to take ranks and determinants of that instead.
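Concretely, once you have $B$, both quantities can be read off from it directly. A quick check in exact arithmetic (a sketch using sympy, which avoids floating-point rank issues):

```python
import sympy as sp

# B is the matrix of x -> Ax in the basis (beta, A*beta, A^2*beta)
B = sp.Matrix([[0, 0, 0],
               [1, 0, 1],
               [0, 1, 0]])
E = sp.eye(3)

print((B - E).rank())     # 2, so rank(A - E) = 2
print((B + 2 * E).det())  # 6, so det(A + 2E) = 6
```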

Second answer:
  1. The claim that $\beta,A\beta,A^2\beta$ are linearly independent is not true without further assumptions: it fails, for example, for $A=E$ (the identity), where the three vectors are all equal to $\beta$.

  2. You could have used the fact that $\beta$ is a sum of eigenvectors: express $A^3\beta$ and $A\beta$ in terms of the eigenvectors and eigenvalues, and use the linear independence of the $\alpha_i$ to derive conditions on the eigenvalues of $A$. Under the assumption that $A$ has three different eigenvalues, this forces $\lambda_i^3=\lambda_i$ for each $i$, hence $\{\lambda_1,\lambda_2,\lambda_3\}=\{0,1,-1\}$, from which both quantities follow. Your answer is fine, and much more direct than the approach I sketched.
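The eigenvalue approach sketched in point 2 can be carried out in a few lines (a sketch using sympy; the eigenvalue set $\{0,1,-1\}$ is derived from $\lambda^3=\lambda$ plus distinctness, not assumed):

```python
import sympy as sp

lam = sp.symbols('lambda')
# Applying A^3 beta = A beta to beta = a1 + a2 + a3 and using the linear
# independence of the eigenvectors gives lambda_i^3 = lambda_i for each i.
roots = sorted(sp.solve(sp.Eq(lam**3, lam), lam))
print(roots)  # three distinct values, which must be the eigenvalues of A

# det(A + 2E) is the product of (lambda_i + 2); rank(A - E) is the number
# of eigenvalues lambda_i with lambda_i - 1 nonzero (A is diagonalizable).
print(sp.prod(l + 2 for l in roots))
print(sum(1 for l in roots if l != 1))
```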