Eigenvalues and eigenspaces in a symmetric matrix


Consider the following:

$$ Q = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix} ,\qquad X= \begin{pmatrix} x \\ y \\ z \end{pmatrix} \in \mathbb R^3 $$

I have these questions:

  1. Is $\lambda = 1$ an eigenvalue of $Q$? Why?

Yes, because:

$$\lambda \text{ is an eigenvalue of } A \iff |A - \lambda I_n | = 0$$

and in this case you have:

$$|A - \lambda I_n | = |Q - I_3 | = \begin{vmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{vmatrix} = 0$$
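As a quick sanity check (my addition, not part of the original exercise), the determinant computation can be verified numerically; `det3` is a hypothetical helper implementing cofactor expansion for a $3 \times 3$ matrix:

```python
# Check that lambda = 1 is an eigenvalue of Q by computing det(Q - 1*I)
# with the cofactor-expansion formula for a 3x3 matrix.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

Q = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

lam = 1
Q_minus_I = [[Q[i][j] - (lam if i == j else 0) for j in range(3)]
             for i in range(3)]

print(det3(Q_minus_I))  # 0, so lambda = 1 is an eigenvalue
print(det3(Q))          # 4, used below for the product of eigenvalues
```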

  2. Find all the eigenvalues of $Q$, and their geometric/algebraic multiplicities.

$$A \text{ is symmetric} \implies A \text{ is diagonalizable}$$

$$A \text{ is diagonalizable} \implies p_A(\lambda) \text{ is totally decomposable}$$

$$p_A(\lambda) \text{ is totally decomposable} \implies \begin{cases} \det(A) = \lambda_1 \lambda_2 \cdots \lambda_n \\ \operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n\end{cases}$$

Since I know $\lambda_1 = 1$, and since $\det(Q) = 4$ and $\operatorname{tr}(Q) = 6$:

$$ \begin{cases} \lambda_2 \cdot \lambda_3 = 4 \\ \lambda_2 + \lambda_3 = 5 \end{cases} \implies \lambda_2, \lambda_3 \text{ are roots of } t^2 - 5 t + 4 = 0 \implies \lambda_2 = 1 ,\quad \lambda_3 = 4 $$
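The step above can be checked numerically (a sketch of my own, not in the original post): knowing $\lambda_1 = 1$, the other two eigenvalues are the roots of the quadratic built from the remaining product and sum.

```python
# Recover the remaining eigenvalues from det(Q) = 4 and tr(Q) = 6,
# given that lambda_1 = 1.
import math

det_Q, tr_Q = 4, 6
lam1 = 1
s = tr_Q - lam1   # lambda_2 + lambda_3 = 5
p = det_Q / lam1  # lambda_2 * lambda_3 = 4

# Roots of t^2 - s*t + p = 0 via the quadratic formula.
disc = math.sqrt(s * s - 4 * p)
lam2, lam3 = (s - disc) / 2, (s + disc) / 2
print(lam2, lam3)  # 1.0 4.0
```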

So the eigenvalues and their multiplicities (writing $\mu$ for the algebraic multiplicity and $m$ for the geometric one) are:

$$ \lambda_1 = 1 ,\quad \mu(1) = 2 \\ \lambda_2 = 4 ,\quad \mu(4) = 1 ,\quad m(4) = 1 $$

To find $m(1)$ I can use the following:

$$A \in\mathscr M_\mathbb R(n) \text{ is diagonalizable} \iff m(\lambda_1) + m(\lambda_2) + \cdots + m(\lambda_k) = n$$

where $\lambda_1, \dots, \lambda_k$ are the distinct eigenvalues of $A$.

So:

$$m(1) = n - m(4) = 3 - 1 = 2$$

  3. Find an orthogonal basis for each eigenspace.

I suppose I have to use the Spectral Theorem, but I really don't know how to proceed.

UPDATE: I've tried to solve this question using the method suggested by Michael Seifert.

For the eigenvalue $\lambda_1 = 1$, the eigenspace $V_1$ is the null space of

$$ Q - I_3 = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix} $$

$$ \dim(V_1) = 3 - \operatorname{rk}(Q - I_3) = 3 - 1 = 2 $$

So the Cartesian equation of $V_1$ is:

$$ x + y + z = 0 $$

Passing to parametric equations:

$$ \begin{cases} x = \alpha \\ y = \beta \\ z = - \alpha - \beta \end{cases} $$

So a basis for $V_1$ is the following:

$$ \mathscr B_{V_1} = \begin{Bmatrix}\begin{pmatrix}1\\0\\-1\end{pmatrix},\begin{pmatrix}0\\1\\-1\end{pmatrix}\end{Bmatrix} $$

Now I can apply the Gram-Schmidt process to make this basis orthogonal, and do the same for $\lambda_2 = 4$.
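The Gram-Schmidt step on the basis found above can be sketched as follows (my own illustration; the helper names `dot` and `gram_schmidt` are not from the post):

```python
# Orthogonalize the basis {(1,0,-1), (0,1,-1)} of V_1 with
# classical Gram-Schmidt (no normalization).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Return an orthogonal list spanning the same subspace."""
    ortho = []
    for v in vectors:
        w = list(v)
        for u in ortho:
            coeff = dot(v, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

b1, b2 = [1, 0, -1], [0, 1, -1]
u1, u2 = gram_schmidt([b1, b2])
print(u1)           # [1, 0, -1]
print(u2)           # [-0.5, 1.0, -0.5]
print(dot(u1, u2))  # 0.0, so the pair is orthogonal
```

Scaling $u_2$ by $2$ gives the integer vector $(-1, 2, -1)$, which is often more convenient to write down.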

  4. Does there exist a solution of $X^T Q X = 0$ with $X \neq 0$? Why?

I don't know how to answer this one either.


My questions are the following:

  1. Are (1) and (2) right?
  2. Is there a faster way to solve (1) and (2)?
  3. How can I solve (3) and (4)?


BEST ANSWER

It's not necessarily faster, but another way to solve (2) uses the determinant more directly. If $\lambda$ is an eigenvalue, we have

$$ 0 = |Q - \lambda I_3| = \begin{vmatrix} 2 - \lambda & 1 & 1 \\ 1 & 2- \lambda & 1 \\ 1 & 1 & 2- \lambda \end{vmatrix} $$

Expanding along the first row:

$$ 0 = (2 - \lambda)\left( (2 - \lambda)^2 - 1 \right) - \left( (2 - \lambda) - 1 \right) + \left( 1 - (2 - \lambda) \right) $$

$$ 0 = 4 - 9 \lambda + 6 \lambda^2 - \lambda^3 $$

You've already been clued in that $\lambda = 1$ is a root of this polynomial, so you can factor out $(1 - \lambda)$ to get

$$ 0 = (1 - \lambda)(4 - 5 \lambda + \lambda^2) $$

and the roots of the remaining quadratic are $\lambda = 1$ and $\lambda = 4$. In fact, the full factorization is

$$ 0 = (1 - \lambda)^2 (4 - \lambda); $$

since $1$ is a double root of the polynomial, it is an eigenvalue of $Q$ with algebraic multiplicity $2$, while $4$ is an eigenvalue with multiplicity $1$.
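As a quick check of the factorization above (my addition, with hypothetical helper names), the expanded and factored forms agree at every test point:

```python
# Verify that the expanded characteristic polynomial of Q matches
# the factored form (1 - t)^2 (4 - t).

def p_expanded(t):
    return 4 - 9 * t + 6 * t**2 - t**3

def p_factored(t):
    return (1 - t) ** 2 * (4 - t)

# The two forms agree on a range of integer inputs...
for t in range(-5, 10):
    assert p_expanded(t) == p_factored(t)

# ...and the claimed eigenvalues are roots.
print(p_expanded(1), p_expanded(4))  # 0 0
```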

As far as (3) goes, here's what I would do: if $\mathbf{v}$ is an eigenvector of $Q$ with eigenvalue $\lambda = 1$, then it must be the case that $$ (Q - I_3) \mathbf{v} = 0, $$ i.e., $\mathbf{v}$ must lie in the null space of $$ Q - I_3 = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} $$ You can then use the usual techniques to find the null vectors of this matrix, and use the Gram-Schmidt process (if necessary) to make the basis orthogonal. Similarly, finding the null space of $Q - 4 I_3$ will yield the eigenvector for $\lambda = 4$ (note that there will only be one up to scaling, since this eigenspace is one-dimensional).

Finally, for (4): suppose that $X$ existed such that $X^T Q X = 0$. One of the neat facts about eigenvectors of symmetric matrices is that they span the entire vector space and can be chosen to be orthogonal; so you must have $X = \alpha_1 X_{1} + \alpha_2 X_2 + \alpha_3 X_3$, where the $X_i$ vectors are all mutually orthogonal, and where the only way to get $X = 0$ is to have $\alpha_1 = \alpha_2 = \alpha_3 = 0$. Try plugging this into the quantity $X^T Q X$ and see what it tells you.
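A small numeric illustration of where this argument leads (my own sketch, not the answerer's; `quad_form` is a hypothetical helper): evaluating $X^T Q X$ at a few nonzero vectors always gives a positive value, consistent with all eigenvalues of $Q$ being positive.

```python
# Evaluate the quadratic form X^T Q X at a few nonzero vectors.

Q = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

def quad_form(x):
    """Compute x^T Q x for a length-3 vector x."""
    return sum(x[i] * Q[i][j] * x[j] for i in range(3) for j in range(3))

for x in ([1, 0, 0], [1, -1, 0], [1, 1, 1], [2, -3, 1]):
    print(x, quad_form(x))  # every value printed is positive
```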

ANOTHER ANSWER

For part 3 you want to find all vectors $(x,y,z)$ such that

$$x+y+z=0;$$

that is the condition coming from your matrix for the eigenvalue $1$. It is easy to see that two independent solutions are $(1,0,-1)$ and $(0,1,-1)$; you then need to orthogonalize them. For the eigenvalue $4$, the matrix is $$Q - 4 I_3 = \begin{pmatrix} -2&1&1\\1&-2&1\\1&1&-2\end{pmatrix};$$ this has rank $2$ and it gives the eigenvector $(1,1,1)$.
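The eigenvectors named above are easy to verify directly (my check, with a hypothetical `matvec` helper): multiplying by $Q$ should scale each one by its eigenvalue.

```python
# Verify the claimed eigenvectors of Q by direct multiplication.

Q = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

def matvec(m, v):
    """Matrix-vector product for a 3x3 matrix and length-3 vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

print(matvec(Q, [1, 1, 1]))   # [4, 4, 4]  -> eigenvalue 4
print(matvec(Q, [1, 0, -1]))  # [1, 0, -1] -> eigenvalue 1
print(matvec(Q, [0, 1, -1]))  # [0, 1, -1] -> eigenvalue 1
```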

For the fourth part, $Q$ is diagonalized by an orthogonal transform, so it suffices to solve the problem for a diagonal matrix; it boils down to finding $x,y,z$ such that $4x^2+y^2+z^2=0$, which has only the solution $x=y=z=0$ if you are working over the reals.