I have run into a confusing problem in my linear algebra class:
Let $A$ be a real $2 \times 2$ matrix with a complex eigenvalue $\lambda = a - bi$ ($b \ne 0$) and an associated eigenvector $V \in \mathbb{C}^2$.
Show that $A(\Re V) = a\Re V + b\Im V$ and $A(\Im V) = -b\Re V + a\Im V$.
I suspect that I should solve it using this result:

$A = PCP^{-1}, \quad \text{where } P = [\Re V \;\; \Im V] \text{ and } C = \begin{bmatrix} a & -b \\ b & a \end{bmatrix}.$
However, I don't know how to invert the matrix $P = [\Re V \;\; \Im V]$, nor how to multiply the $2 \times 2$ matrix $A$ by it, when no numerical entries are given. Any direction on this problem and/or a solution would be much appreciated. Thank you!
It is in fact very straightforward; you don't need the equation
$A = PCP^{-1}, \; \text{etc}; \tag 0$
instead, we may work directly from the definition
$V = \Re V + i\Im V; \tag 1$
we thus have, since $V$ is an eigenvector for eigenvalue $a - bi$,
$A\Re V + iA\Im V = A(\Re V + i\Im V) = AV = (a - bi)(\Re V + i \Im V) = (a\Re V + b \Im V) + i(a \Im V - b\Re V); \tag 2$
equating real and imaginary parts, using the fact that $A$, $\Re V$ and $\Im V$ are real:
$A\Re V = a\Re V + b\Im V, \tag 3$
$A\Im V = a\Im V - b\Re V, \tag 4$
which was to be proved.
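If you'd like a concrete sanity check of (3) and (4), here is a small NumPy sketch with hypothetical values $a = 2$, $b = 3$ (so $\lambda = 2 - 3i$) and a real matrix $A$ chosen to have that eigenvalue; the specific $A$ and $V$ below are my own illustrative choices, not part of the problem.

```python
import numpy as np

# Hypothetical example: a = 2, b = 3, lambda = a - bi = 2 - 3i.
# A is a real 2x2 matrix with trace 2a = 4 and det a^2 + b^2 = 13,
# so its eigenvalues are 2 +/- 3i.
a, b = 2.0, 3.0
A = np.array([[5.0, -6.0],
              [3.0, -1.0]])

# V = (1 + i, i)^T is an eigenvector of A for lambda = a - bi:
V = np.array([1.0 + 1.0j, 1.0j])
lam = a - 1j * b
assert np.allclose(A @ V, lam * V)

ReV, ImV = V.real, V.imag

# A(Re V) = a Re V + b Im V, as in (3):
assert np.allclose(A @ ReV, a * ReV + b * ImV)
# A(Im V) = a Im V - b Re V, as in (4):
assert np.allclose(A @ ImV, a * ImV - b * ReV)
print("identities (3) and (4) verified")
```

Any other real $2 \times 2$ matrix with a non-real eigenvalue would serve equally well; the identities depend only on equating real and imaginary parts of $AV = \lambda V$.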