"Does $A = \begin{bmatrix}1&2\\2&4\end{bmatrix}$ have a zero eigenvalue?"
Well, it would be a funny question to ask if the asker hadn't stated that he wants us to explain without computing the characteristic polynomial. I have no idea how to do this.
This matrix has determinant $0$: $\det A = 1 \cdot 4 - 2 \cdot 2 = 0$. So the matrix is non-invertible, and hence must have $0$ as an eigenvalue ($A - \lambda I$ is not invertible for $\lambda = 0$).
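As a quick numerical sanity check of this argument (a sketch using NumPy, not part of the original answer):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 4]], dtype=float)

# The determinant is 1*4 - 2*2 = 0, so A is singular.
print(np.linalg.det(A))

# eigvalsh is suitable here because A is symmetric;
# the eigenvalues come back sorted in ascending order.
print(np.linalg.eigvalsh(A))
```

The determinant evaluates to (numerically) zero, and $0$ shows up directly in the eigenvalue list alongside $5$.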
No problem. Just go back to the definition.
$Ax = \lambda x$
Find a $\lambda$ and a nonzero $x$ such that the above holds. For $\lambda = 0$ there are in fact infinitely many such $x$, since the matrix has determinant zero and is therefore singular.
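To make the definition concrete: $x = (2, -1)$ works for $\lambda = 0$, since $1 \cdot 2 + 2 \cdot (-1) = 0$ and $2 \cdot 2 + 4 \cdot (-1) = 0$. A minimal check (my choice of vector, assuming NumPy):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 4]], dtype=float)

# x = (2, -1) satisfies A x = 0 = 0 * x, so lambda = 0 is an eigenvalue.
x = np.array([2.0, -1.0])
print(A @ x)

# Every scalar multiple of x also works, hence infinitely many eigenvectors.
print(A @ (7 * x))
```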
I'm not sure how green exactly you are, but the determinant is a very easy way to determine whether a matrix has a $0$ eigenvalue. Taking the determinant is usually illustrated with the matrix below, which I will call $M$:
$M = \begin{bmatrix}a&b\\c&d\end{bmatrix}$
where $\det M = ad - cb$. If $\det M = 0$, we know there is at least one eigenvalue that equals $0$. Another route is to solve
$$\begin{bmatrix}1&2\\2&4\end{bmatrix} \begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix},$$
that is,
$$x + 2y = 0$$
$$2x + 4y = 0.$$
You can then use one of these equations to solve for $x$ and substitute it into the other (this is a common grade 11/12 method, and I believe it is what he's trying to get you to avoid with the characteristic-polynomial comment).
Lastly, the first method you'd probably be taught in college (before you'd learn what a determinant is) is row operations: in this case $R_2 - 2R_1$ (where $R_1$ is row 1 of the matrix) yields the matrix $\begin{bmatrix}1&2\\0&0\end{bmatrix}$. Now, since $R_2$ is a row of all $0$'s, we know we have a $0$ eigenvalue. I believe most second-year courses would stop here and simply note that the matrix is singular.
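The row operation above can be carried out directly (a small NumPy sketch of the $R_2 - 2R_1$ step, my own illustration):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 4]], dtype=float)

# Apply R2 <- R2 - 2*R1: this zeroes out the second row entirely.
A[1] = A[1] - 2 * A[0]
print(A)
```

The zero row means the echelon form has rank $1 < 2$, so $Ax = 0$ has nonzero solutions.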
You know that $$2 = \dim \ker A + \dim {\rm col} \ A.$$
Here $\dim {\rm col} \ A = 1$, since the second column is twice the first. Then $\dim \ker A = 1$, so we have nonzero vectors in $\ker A = \ker(A - 0 \ {\rm Id})$. So yes, $0$ is an eigenvalue.
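The rank–nullity bookkeeping can be verified numerically (a sketch assuming NumPy; `matrix_rank` computes $\dim {\rm col}\, A$):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 4]], dtype=float)

rank = np.linalg.matrix_rank(A)   # dim col A
nullity = A.shape[1] - rank       # rank-nullity: 2 = rank + nullity
print(rank, nullity)
```

Since the nullity is positive, $\ker A$ contains nonzero vectors, which is exactly the eigenvalue-$0$ condition.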