Decomposition of complex vector space


I'm new to Math Stack Exchange, so apologies in advance if this question is improperly formatted.

I'm working on the following linear algebra problem and I'm having some difficulties completing the proof.

Let $ V $ be a complex vector space and $ T \in \operatorname{End}(V) $ such that $ T^4 = \operatorname{id}_V $. Show that $ V $ decomposes into a direct sum of four subspaces.

So far, I've chosen $$ V_+ = \{v \in V\ |\ T(v) = v \}, \quad V_- = \{v \in V\ |\ T(v) = -v \}, \\ V_i = \{v \in V\ |\ T(v) = iv \}, \quad \textrm{and} \quad V_{-i} = \{v \in V\ |\ T(v) = -iv \} $$ as my subspaces.

Next, to show that the sum of these subspaces is direct, it is sufficient to show that for any $ v_+ \in V_+ $, $ v_- \in V_- $, $ v_i \in V_i $, and $ v_{-i} \in V_{-i} $, $ v_+ + v_- + v_i + v_{-i} = \vec{0} \Rightarrow v_+ = v_- = v_i = v_{-i} = \vec{0} $. If necessary, I can edit in my proof of this statement.

Finally, I need to show that $ V = V_+ + V_- + V_i + V_{-i} $. Given some $ v \in V $, I need to choose $ v_+ \in V_+ $, $ v_- \in V_- $, $ v_i \in V_i $, and $ v_{-i} \in V_{-i} $ such that $ v = v_+ + v_- + v_i + v_{-i} $. This is where I'm struggling.

Having done a similar question earlier (with $ T^2 = id $ and with only two subspaces), I know that, roughly, they will look something like $ v_+ = \frac{1}{4}(v + T^3(v)) $, $ v_- = \frac{1}{4}(v - T^3(v)) $, $ v_i = \frac{1}{4}(v + iT^3(v)) $, and $ v_{-i} = \frac{1}{4}(v - iT^3(v)) $. I know that these are wrong because I can't show that $ v_+ \in V_+ $, $ v_- \in V_- $, etc. I've tried replacing $ T^3 $ with $ T^2 $ and I've also tried replacing $ T $ with $ T^2 $ in my original choice of subspaces but was unsuccessful. Any pointers or suggestions on how to complete this proof would be greatly appreciated.

1 Answer

Best answer

Hint Consider the action of general linear combinations $$S := a_0 \operatorname{id} + a_1 T + a_2 T^2 + a_3 T^3$$ on vectors in each subspace. For example, what combination $S_+$ gives $S_+(v_+) = v_+$ for all $v_+ \in V_+$ but $S_+(v_{\bullet}) = 0$ for $v_{\bullet}$ in any other subspace?

(Notice that this was effectively what you did for the case of operators satisfying $T^2 = \operatorname{id}$; in that case the general linear combination only had two terms, $b_0 \operatorname{id} + b_1 T$.)
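As a concrete sanity check of the hint, here is a small numerical sketch (my own illustration, not part of the original thread). For each fourth root of unity $\lambda$, the candidate combination is $P_\lambda = \frac{1}{4}\sum_{k=0}^{3} \lambda^{-k} T^k$; the geometric-series identity $\sum_{k=0}^{3} (\mu/\lambda)^k = 0$ for distinct fourth roots of unity $\mu \neq \lambda$ makes $P_\lambda$ fix $V_\lambda$ and annihilate the other three eigenspaces. The matrix $T$ below (rotation by $90^\circ$) is an arbitrary choice satisfying $T^4 = \operatorname{id}$:

```python
import numpy as np

# T: rotation by 90 degrees, an illustrative choice with T^4 = I.
T = np.array([[0, -1],
              [1,  0]], dtype=complex)
I = np.eye(2, dtype=complex)
assert np.allclose(np.linalg.matrix_power(T, 4), I)

# Candidate projection onto the lam-eigenspace:
#   P_lam = (1/4) * sum_{k=0}^{3} lam^(-k) T^k
def projection(lam):
    return sum(lam ** (-k) * np.linalg.matrix_power(T, k) for k in range(4)) / 4

eigenvalues = [1, -1, 1j, -1j]
projections = {lam: projection(lam) for lam in eigenvalues}

# The four combinations sum to the identity, so every v decomposes as
# v = P_1(v) + P_{-1}(v) + P_i(v) + P_{-i}(v).
assert np.allclose(sum(projections.values()), I)

# Each piece lands in the claimed eigenspace: T(P_lam v) = lam * (P_lam v).
v = np.array([2.0, 3.0 + 1.0j])
for lam, P in projections.items():
    assert np.allclose(T @ (P @ v), lam * (P @ v))
```

Note that $P_+ = \frac{1}{4}(\operatorname{id} + T + T^2 + T^3)$ uses all four powers of $T$, which is why the two-term guesses like $\frac{1}{4}(v + T^3(v))$ in the question cannot work.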