In the inner product space $\mathbb{C}^{2}$ with its standard inner product, let $$ T\begin{pmatrix} x\\y \end{pmatrix} = \begin{pmatrix} 3x+4y\\-4x+3y \end{pmatrix} $$ be a linear operator. Express the orthogonal projections onto $T$'s eigenspaces as polynomials in $T$.
What do they mean by expressing the projections as polynomials in $T$?
What I did so far:
I picked the standard orthonormal basis $ B = \left ( \begin{pmatrix} 1\\0 \end{pmatrix} ,\; \begin{pmatrix} 0\\1 \end{pmatrix} \right ) $ for which $ \left [ T \right ]_{B} = \begin{bmatrix} 3 & 4\\-4 & 3 \end{bmatrix} $.
The eigenspaces are $$ V_{3+4i} = Span\left \{ \begin{pmatrix} 1 \\ i\end{pmatrix} \right \} ,\; V_{3-4i} = Span\left \{ \begin{pmatrix} i \\ 1\end{pmatrix} \right \} $$
After normalizing the above vectors, I used the orthogonal projection formula $ P_{U}(v) = \sum_{i=1}^{n} \left \langle v, u_{i} \right \rangle u_{i} $ (where $u_1, \dots, u_n$ is an orthonormal basis of $U$) and found that $$ P_{V_{3+4i}}(v) = \frac{1}{2}\begin{pmatrix} x-yi\\y+xi \end{pmatrix} $$ (I have only done the calculation for the first eigenspace so far.)
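As a sanity check, this calculation can be verified numerically in plain Python (the helper names `inner` and `project` below are my own, just for illustration):

```python
# Orthogonal projection onto the span of a unit vector u in C^2, using
# P_U(v) = <v, u> u with the standard inner product
# <v, w> = sum_j v_j * conj(w_j) (conjugate-linear in the second slot).

def inner(v, w):
    return sum(vj * wj.conjugate() for vj, wj in zip(v, w))

def project(v, u):
    # Assumes u is a unit vector, so (u) is an orthonormal basis of span{u}.
    c = inner(v, u)
    return [c * uj for uj in u]

# Normalized eigenvector for lambda = 3 + 4i: (1, i) / sqrt(2).
s = 2 ** -0.5
u1 = [s, s * 1j]

# Compare with the closed form (1/2)(x - iy, y + ix) at a sample point.
x, y = 2 + 1j, -3 + 4j
p = project([x, y], u1)
closed_form = [(x - 1j * y) / 2, (y + 1j * x) / 2]
```

The two results agree up to floating-point rounding.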
So, how do I express this as a polynomial in $T$?
Thanks.
I'll answer the question in a more general context because once we identify the important ingredients, the solution becomes almost obvious.
Fix a finite-dimensional vector space $V$ and let $T \colon V \rightarrow V$ be a diagonalizable operator with two distinct eigenvalues $\lambda_1, \lambda_2$. Write $V_i = \ker(T - \lambda_i I)$, so $V = V_1 \oplus V_2$, and denote by $P_i \colon V \rightarrow V$ the projection of $V$ onto $V_i$ with respect to this direct sum decomposition. Thus, every vector $v \in V$ can be written uniquely as $v = v_1 + v_2$ with $v_i \in V_i$, and then $Tv_i = \lambda_i v_i$ and $P_i v = v_i$. How can we express the $v_i$'s using $T$? Consider for example $v_1$:
$$ v = v_1 + v_2, Tv = \lambda_1 v_1 + \lambda_2 v_2 \implies Tv - \lambda_2 v = (\lambda_1 - \lambda_2) v_1 \implies \\ v_1 = \frac{1}{\lambda_1 - \lambda_2} (Tv - \lambda_2 v) = \frac{1}{\lambda_1 - \lambda_2} (T - \lambda_2 I)(v).$$
Thus, if we let $p_1(x) = \frac{1}{\lambda_1 - \lambda_2} (x - \lambda_2)$ we get $P_1 = p_1(T)$. Note that $p_1$ is a linear polynomial satisfying $p_1(\lambda_1) = 1$ and $p_1(\lambda_2) = 0$ (and this determines $p_1$ uniquely). The important observation here is that if $p$ is a polynomial and $v$ is an eigenvector of $T$ corresponding to the eigenvalue $\lambda$, then $v$ is also an eigenvector of $p(T)$ with $p(T)v = p(\lambda)v$. Thus, if we want a polynomial $p$ such that $p(T)$ acts on $v = v_1 + v_2$ by throwing away the $v_2$ part, we need $p$ to satisfy $p(\lambda_1) = 1$ and $p(\lambda_2) = 0$. There are many such polynomials, but the one of minimal degree is the linear one we found explicitly. Similarly, taking $p_2(x) = \frac{1}{\lambda_2 - \lambda_1}(x - \lambda_1)$ gives $P_2 = p_2(T)$. This idea generalizes easily to the case where $T$ has more than two distinct eigenvalues: the Lagrange interpolation polynomials $p_i(x) = \prod_{j \neq i} \frac{x - \lambda_j}{\lambda_i - \lambda_j}$ satisfy $p_i(T) = P_i$.
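To illustrate the construction, here is a minimal plain-Python sketch for a $2 \times 2$ matrix (the matrix helpers and the function `eigen_projection` are written here just for this example, not a library API):

```python
# Spectral projections of a diagonalizable 2x2 matrix T with distinct
# eigenvalues l1, l2, via P_i = p_i(T) with p_i(x) = (x - l_j)/(l_i - l_j).

def mat_scale(c, A):
    return [[c * a for a in row] for row in A]

def mat_sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I2 = [[1, 0], [0, 1]]

def eigen_projection(T, li, lj):
    # p_i(T) = (T - lj*I) / (li - lj)
    return mat_scale(1 / (li - lj), mat_sub(T, mat_scale(lj, I2)))

T = [[3, 4], [-4, 3]]
l1, l2 = 3 + 4j, 3 - 4j
P1 = eigen_projection(T, l1, l2)
P2 = eigen_projection(T, l2, l1)
```

One can then check numerically that $P_1 + P_2 = I$, $P_i^2 = P_i$, and $TP_i = \lambda_i P_i$, exactly as the spectral projections should behave.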
If $V$ is an inner product space and the operator is orthogonally diagonalizable (which over $\mathbb{C}$ is equivalent to $T$ being normal, as your $T$ is), the direct sum decomposition is orthogonal ($V_1 \perp V_2$) and the projections $P_i$ are orthogonal projections.
In your case, $\lambda_1 = 3 + 4i, \lambda_2 = 3 - 4i$ and so
$$p_1(x) = \frac{1}{3 + 4i - (3 - 4i)} (x - (3 - 4i)) = -\frac{i}{8} (x - (3 - 4i)) = \frac{1}{8} (4 + 3i - ix) $$
and indeed
$$ [p_1(T)]_B = p_1([T]_B) = \frac{1}{8} ((4 + 3i)I - i[T]_B) = \frac{1}{8} \left( \begin{pmatrix} 4 + 3i & 0 \\ 0 & 4 + 3i \end{pmatrix} - \begin{pmatrix} 3i & 4i \\ -4i & 3i \end{pmatrix} \right) = \\ \frac{1}{8} \begin{pmatrix} 4 & -4i \\ 4i & 4 \end{pmatrix} = \frac{1}{2} \begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix}$$
which is consistent with the projection you got.
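For completeness, a quick numeric spot-check of this final computation in plain Python (again, just a sketch):

```python
# Check p_1(lambda_1) = 1, p_1(lambda_2) = 0, and that applying p_1 to
# [T]_B, i.e. (1/8)((4 + 3i) I - i [T]_B), reproduces (1/2) [[1, -i], [i, 1]].

def p1(x):
    return (4 + 3j - 1j * x) / 8

T = [[3, 4], [-4, 3]]
P1 = [[((4 + 3j) * (i == j) - 1j * T[i][j]) / 8 for j in range(2)]
      for i in range(2)]
```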