I'm wondering if there's a fairly easy algorithm by which one can, by hand, find eigenvectors corresponding to complex eigenvalues for small matrices.
Of course, one can always row reduce, but it can get ugly pretty quickly. This is mainly for students who are a little uneasy with complex number arithmetic.
The purpose would be for a linear algebra course, so I'm looking for something pretty straightforward and easy to explain.
Should I tell them to suck it up and learn how to deal with complex numbers? Or is there a nice algorithm that involves only real-number arithmetic and is feasible for, say, $2\times 2$, $3\times 3$, and $4\times 4$ computations?
If you want to work with real matrices without ever leaving $\mathbb R$, you can always use Weil restriction: write the eigenvalue as $\lambda = \lambda_0 + i \lambda_1$ and the eigenvector as $x = x_0 + i x_1$, and split $Ax = \lambda x$ into its real and imaginary parts, $$ A x_0 = \lambda_0 x_0 - \lambda_1 x_1, \quad A x_1 = \lambda_0 x_1 + \lambda_1 x_0,$$ which amounts to finding an element of the kernel of the real $2n\times 2n$ matrix $$\begin{pmatrix} A -\lambda_0 \operatorname{Id} & \lambda_1 \operatorname{Id}\\ -\lambda_1 \operatorname{Id} & A-\lambda_0 \operatorname{Id}\end{pmatrix}.$$ (This is just the standard embedding of $\mathbb C$ into $\mathbb R^{2\times 2}$ sending $i \mapsto J = \tiny \begin{pmatrix} 0&-1\\1&0\end{pmatrix}$: the big matrix is the image of $A - \lambda \operatorname{Id}$ under it, namely $\operatorname{Id}_2 \otimes A - (\lambda_0 \operatorname{Id}_2 + \lambda_1 J) \otimes \operatorname{Id}_n$.)
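If you want to see the bookkeeping concretely, here is a small sketch of the idea in Python/NumPy (the function name and the use of the SVD to extract a kernel vector are my choices, not part of the method itself): build the doubled real matrix, take a null vector $(x_0, x_1)$, and reassemble $x = x_0 + i x_1$.

```python
import numpy as np

def eigvec_via_weil_restriction(A, lam):
    """Eigenvector of a real matrix A for a complex eigenvalue lam,
    using only real arithmetic: solve for (x0, x1) in the kernel of
    the doubled 2n x 2n real matrix, then set x = x0 + i*x1."""
    n = A.shape[0]
    l0, l1 = lam.real, lam.imag
    I = np.eye(n)
    # [ A - l0*Id    l1*Id   ]
    # [ -l1*Id     A - l0*Id ]
    big = np.block([[A - l0 * I,  l1 * I],
                    [-l1 * I,     A - l0 * I]])
    # A kernel vector is the right-singular vector for the smallest
    # singular value (numerically ~0 when lam is an eigenvalue).
    _, _, Vt = np.linalg.svd(big)
    v = Vt[-1]
    x0, x1 = v[:n], v[n:]
    return x0 + 1j * x1

# Rotation by 90 degrees: eigenvalues are +/- i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
x = eigvec_via_weil_restriction(A, 1j)
print(np.allclose(A @ x, 1j * x))  # verify A x = lam x
```

Note the kernel of the big matrix is at least 2-dimensional (if $x$ works, so does $ix$, i.e. $(x_0,x_1)\mapsto(-x_1,x_0)$); any nonzero kernel vector reassembles to a valid eigenvector.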
The moral of this is: the best solution is “suck it up and learn how to deal with complex numbers”. Actually doing this for a few $2\times 2$ matrices should convince them of it.
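In the $2\times 2$ case the complex arithmetic really is painless: the rows of $A - \lambda\operatorname{Id}$ are proportional, so from the first row $(a-\lambda,\; b)$ one can read off the eigenvector $(b,\; \lambda - a)$ directly (assuming that row is nonzero). A quick check on a concrete matrix — the specific numbers here are just an illustration I picked:

```python
import cmath

# A = [[1, -2], [1, 3]] has characteristic polynomial
# lam^2 - tr*lam + det = lam^2 - 4*lam + 5, with complex roots.
a, b, c, d = 1.0, -2.0, 1.0, 3.0
tr, det = a + d, a * d - b * c
lam = (tr + cmath.sqrt(tr * tr - 4 * det)) / 2  # lam = 2 + i

# First row of A - lam*Id is (a - lam, b), so (b, lam - a) kills it;
# the second row follows automatically from the characteristic equation.
x = (b, lam - a)

# Verify A x = lam x componentwise.
print(a * x[0] + b * x[1], lam * x[0])
print(c * x[0] + d * x[1], lam * x[1])
```

No row reduction needed: one complex multiplication per component to verify, and the eigenvector for $\bar\lambda$ is just the conjugate.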