Suppose $A$ is a $2\times 2$ matrix with the complex eigenvalues $\lambda = \alpha \pm i \beta$.
I need to show that $A = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}V^{-1}$, where $V$ is an invertible matrix with columns $v_{1}$ and $v_{2}$. Then, I need to show that $v_{1}+iv_{2}$ is an eigenvector corresponding to $\lambda = \alpha + i\beta$.
First of all, this is for a differential equations course, and it has been a very long time since I've done any serious linear algebra. I think this problem is asking me to show that $A$ is similar to a matrix whose entries are (up to sign) the real and imaginary parts of its eigenvalues, but I'm not sure how to do that.
Secondly, how do I show that $v_{1}+iv_{2}$ is an eigenvector for the eigenvalue $\alpha + i\beta$ without explicitly knowing what $v_{1}$ and $v_{2}$ are? (Or will I know? I'm very confused.)
I am in a bit over my head with this problem and could really use some guidance. I thank you for your time and patience.
Assume that $\beta\ne 0$; otherwise we’re dealing with a repeated real eigenvalue, which requires a different analysis. A $2\times2$ matrix with eigenvalues $\alpha\pm i\beta$ has characteristic equation $(\lambda-\alpha)^2+\beta^2=0$. Let $B=A-\alpha I$. By the Cayley-Hamilton theorem, $(A-\alpha I)^2+\beta^2I=0$, therefore $B^2=-\beta^2I$.

Let $\mathbf v_1$ be any nonzero real vector and define $\mathbf v_2 = -\frac1\beta B\mathbf v_1$. Then $$B\mathbf v_2 = -\frac1\beta B^2\mathbf v_1 = \beta\mathbf v_1$$ and by definition $$B\mathbf v_1=-\beta\mathbf v_2.$$

Let $V = \begin{bmatrix}\mathbf v_1&\mathbf v_2\end{bmatrix}$. This matrix is invertible: if $\mathbf v_2$ were a real multiple of $\mathbf v_1$, then $\mathbf v_1$ would be a real eigenvector of $B$, but the eigenvalues of $B$ are $\pm i\beta$, which are not real since $\beta\ne0$. Now $$BV\begin{bmatrix}1\\0\end{bmatrix} = B\mathbf v_1 = -\beta\mathbf v_2 = -\beta V\begin{bmatrix}0\\1\end{bmatrix}$$ and $$BV\begin{bmatrix}0\\1\end{bmatrix} = B\mathbf v_2 = \beta\mathbf v_1 = \beta V\begin{bmatrix}1\\0\end{bmatrix},$$ therefore $$BV = V\begin{bmatrix}0&\beta\\ -\beta&0\end{bmatrix}.$$

Finally, $$A = \alpha I+B = \alpha I+V\begin{bmatrix}0&\beta\\-\beta&0\end{bmatrix}V^{-1} = V\begin{bmatrix}\alpha&\beta\\-\beta&\alpha\end{bmatrix}V^{-1}.$$
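If you want to sanity-check the construction numerically, here is a short NumPy sketch. The matrix $A$ and the vector $\mathbf v_1$ below are arbitrary choices of mine, not part of the problem; $A=\begin{bmatrix}3&-2\\4&-1\end{bmatrix}$ happens to have eigenvalues $1\pm 2i$.

```python
import numpy as np

# Hypothetical example: this A has eigenvalues 1 +/- 2i, so alpha = 1, beta = 2.
A = np.array([[3.0, -2.0],
              [4.0, -1.0]])
alpha, beta = 1.0, 2.0

# B = A - alpha*I satisfies B^2 = -beta^2 I by Cayley-Hamilton.
B = A - alpha * np.eye(2)
assert np.allclose(B @ B, -beta**2 * np.eye(2))

# Pick any nonzero real v1 and set v2 = -(1/beta) B v1, as in the argument above.
v1 = np.array([1.0, 0.0])
v2 = -(1.0 / beta) * B @ v1
V = np.column_stack([v1, v2])

C = np.array([[alpha, beta],
              [-beta, alpha]])

# The similarity A = V C V^{-1} should hold.
assert np.allclose(A, V @ C @ np.linalg.inv(V))

# And v1 + i v2 should be an eigenvector for alpha + i beta.
w = v1 + 1j * v2
assert np.allclose(A @ w, (alpha + 1j * beta) * w)
print("all checks pass")
```

Running this with any other nonzero choice of `v1` works just as well, which illustrates that the argument does not depend on knowing $\mathbf v_1$ explicitly.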
Proving the second part is a matter of multiplying out $A(\mathbf v_1+i\mathbf v_2) = (\alpha I+B)(\mathbf v_1+i\mathbf v_2)$ for this choice of $\mathbf v_1$ and $\mathbf v_2$ and doing a bit of straightforward algebra.
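If it helps to see that algebra written out, it uses only the relations $B\mathbf v_1=-\beta\mathbf v_2$ and $B\mathbf v_2=\beta\mathbf v_1$ established above:

$$A(\mathbf v_1+i\mathbf v_2) = \alpha\mathbf v_1 + i\alpha\mathbf v_2 + B\mathbf v_1 + iB\mathbf v_2 = \alpha\mathbf v_1 + i\alpha\mathbf v_2 - \beta\mathbf v_2 + i\beta\mathbf v_1 = (\alpha+i\beta)\mathbf v_1 + i(\alpha+i\beta)\mathbf v_2 = (\alpha+i\beta)(\mathbf v_1+i\mathbf v_2).$$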