So I have the system $$ X' =\begin{pmatrix} 0 & 1 \\ -k & -b \end{pmatrix} X $$
where I assume $$ 0 \le b < 2 \sqrt{k} $$
which results in a complex-conjugate pair of eigenvalues, one of which is $$ \lambda = \tfrac{1}{2}\left(-b + i\sqrt{4k - b^2}\right). $$ However, when I try to row-reduce to get the corresponding eigenvector, I keep getting the identity matrix, meaning my eigenvector is $$ (0, 0). $$ Is this correct? I've been poring over my steps and can't find any errors. However, this would mean the solution is $$ X = \begin{pmatrix} 0\\ 0\end{pmatrix} $$
Thanks in advance
It sounds like you are row-reducing the original matrix. You need to row-reduce the matrix $$\begin{pmatrix} 0 & 1 \\ -k & -b \end{pmatrix} - \lambda I$$ - the null space of that difference (minus the zero vector) is the set of eigenvectors you want. Since $\lambda$ is an eigenvalue, $\det(A - \lambda I) = 0$, so row reduction of that matrix can never produce the identity; getting the identity is the telltale sign that you reduced $A$ itself, whose determinant $k$ is nonzero. Also note that $(0,0)$ is never an eigenvector by definition.
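If you want a quick sanity check, you can do this symbolically. A minimal sketch using sympy, with the example values $k = 2$, $b = 1$ (my choice, just to satisfy $0 \le b < 2\sqrt{k}$):

```python
import sympy as sp

k, b = 2, 1  # example values satisfying 0 <= b < 2*sqrt(k)

A = sp.Matrix([[0, 1], [-k, -b]])

# The eigenvalue from the quadratic formula, with 4k - b^2 > 0 under the root
lam = sp.Rational(1, 2) * (-b + sp.I * sp.sqrt(4 * k - b**2))

# Row-reduce A - lam*I, not A itself: its null space is nonzero
M = A - lam * sp.eye(2)
v = M.nullspace()[0]          # a nonzero eigenvector

print(v)
print(sp.simplify(M * v))     # the zero vector, confirming A v = lam v
```

The eigenvector comes out proportional to $(1, \lambda)^T$, which you can also read off by hand: the first row of $A - \lambda I$ is $(-\lambda, 1)$, forcing $x_2 = \lambda x_1$.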