Diagonalizing a matrix with fractions


I'm revising for an exam in linear algebra, and I've found myself stuck on this one specific exercise.

I'm supposed to find a matrix $P$ and a diagonal matrix $D$ for my matrix $H$ (which I'll post below), so that $P^{-1}HP = D$.

Normally I know how to solve tasks like these, but the fractions are giving me trouble when I try to extract the eigenvalues from the matrix. I'm clueless about how to get them, so if any of you could help me out I'd greatly appreciate it.

$$H= \begin{pmatrix} \frac{3}{2} & - \frac{1}{2} &0\\ - \frac{1}{2} & \frac{3}{2}&0 \\ 0 &0&1\end{pmatrix}$$

2 Answers

BEST ANSWER

Note that: $$ |H-\lambda I|= \begin{vmatrix} \frac{3}{2}-\lambda & -\frac{1}{2} & 0 \\ -\frac{1}{2} & \frac{3}{2}-\lambda & 0 \\ 0 & 0 & 1-\lambda \end{vmatrix}=(1-\lambda)\left(\left(\frac{3}{2}-\lambda\right)^2-\frac{1}{4}\right)= (1-\lambda)(2-3\lambda+\lambda^2)=(1-\lambda)(\lambda-1)(\lambda-2)=-(\lambda-1)^2(\lambda-2) $$ (I computed the determinant via the Laplace expansion along the third column). The eigenvalues are hence $1$ (with multiplicity $2$) and $2$. Can you take it from here?
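Not part of the original answer, but if you want to sanity-check the characteristic polynomial, a quick NumPy computation (assuming NumPy is available) confirms the eigenvalues:

```python
import numpy as np

# H with the fractions written as floats
H = np.array([[ 1.5, -0.5, 0.0],
              [-0.5,  1.5, 0.0],
              [ 0.0,  0.0, 1.0]])

# eigvalsh is the right routine here since H is symmetric;
# it returns the eigenvalues in ascending order
eigvals = np.linalg.eigvalsh(H)
print(np.round(eigvals, 10))  # [1. 1. 2.]
```

The double eigenvalue $1$ shows up twice in the output, matching the factor $(\lambda-1)^2$.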


If you’re having trouble computing eigenvalues, try looking for eigenvectors instead. You should be able to tell at a glance that $(0,0,1)^T$ is an eigenvector of $H$ since that vector gets mapped to itself—the columns of $H$ are the images of the basis vectors. The corresponding eigenvalue is, of course, $1$.

Now focus on the upper-left $2\times2$ block. Its two rows contain the same elements, so the row sums are equal. Right-multiplying $H$ by $(1,1,0)^T$ sums the first two columns (equivalently, by symmetry, the first two entries of each row), so $(1,1,0)^T$ is another eigenvector, linearly independent of the first, with eigenvalue $\frac32-\frac12=1$.

You can always get the last eigenvalue “for free” since the sum of the eigenvalues, counted with multiplicity, equals the trace. Here the trace is $\frac32+\frac32+1=4$, so the remaining eigenvalue is $4-1-1=2$. To find a corresponding eigenvector, recall that a real symmetric matrix can be orthogonally diagonalized, so any eigenvector for eigenvalue $2$ must be orthogonal to both of the eigenvectors for $1$ that you’ve already found. You’re working in $\mathbb R^3$, so such a vector can be found via a cross product: $(1,1,0)^T\times(0,0,1)^T=(1,-1,0)^T$ (which could also have been found by inspection).
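Not in the original answer, but putting the three eigenvectors above into the columns of $P$ lets you verify the full diagonalization numerically (a NumPy sketch, with the eigenvector order chosen to match $D=\operatorname{diag}(1,1,2)$):

```python
import numpy as np

H = np.array([[ 1.5, -0.5, 0.0],
              [-0.5,  1.5, 0.0],
              [ 0.0,  0.0, 1.0]])

# Columns of P: eigenvectors (0,0,1), (1,1,0) for eigenvalue 1,
# and (1,-1,0) for eigenvalue 2
P = np.array([[0.0, 1.0,  1.0],
              [0.0, 1.0, -1.0],
              [1.0, 0.0,  0.0]])

# P^{-1} H P should be the diagonal matrix diag(1, 1, 2)
D = np.linalg.inv(P) @ H @ P
print(np.round(D, 10))
```

Since the eigenvectors are mutually orthogonal, normalizing the columns of $P$ would even make it orthogonal, so $P^{-1}$ could be replaced by $P^T$.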