I am using the textbook Introduction to Vectors and Tensors by Bowen and Wang to self-study linear and multilinear algebra. I was asked to prove one of the theorems left to the readers, but first I must explain a little bit of background:
A projection operator (more succinctly, a projection) on a vector space $V$ is an endomorphism $P: V \to V$ with the property that $P^2(\vec{v}) = P(\vec{v})$ for all $\vec{v} \in V.$
$R(P)$ and $K(P)$ denote the range and kernel of a projection $P,$ respectively, and $I: V \to V$ is the identity automorphism of $V.$
I want to prove the following, keeping in mind that we are only dealing with finite-dimensional vector spaces:
Theorem: If $P_k, k \in \{1, 2, \ldots, N\}$ are projections on $V$ with the following properties:
$(1)$ $P_k^2 = P_k$, $\forall k \in \{1, 2, \ldots, N\}$
$(2)$ $P_kP_q = 0$ (the zero endomorphism) whenever $k \neq q$
$(3)$ $I = \sum_{k=1}^{N}P_k$
then $V = R(P_1) \oplus R(P_2) \oplus \cdots \oplus R(P_N)$, where $\oplus$ denotes the direct sum.
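For concreteness, the coordinate projections onto the three axes of $\mathbb{R}^3$ satisfy all three hypotheses. Here is a quick numerical sanity check (a sketch using NumPy; the matrices are my own illustrative example, not from the textbook):

```python
import numpy as np

# Coordinate projections onto the three axes of R^3 (illustrative example):
# P_k sends v to the component of v along the k-th axis.
P1 = np.diag([1.0, 0.0, 0.0])
P2 = np.diag([0.0, 1.0, 0.0])
P3 = np.diag([0.0, 0.0, 1.0])
Ps = [P1, P2, P3]

# Property (1): each P_k is idempotent.
for P in Ps:
    assert np.allclose(P @ P, P)

# Property (2): distinct projections annihilate each other.
for i, P in enumerate(Ps):
    for j, Q in enumerate(Ps):
        if i != j:
            assert np.allclose(P @ Q, np.zeros((3, 3)))

# Property (3): the projections sum to the identity.
assert np.allclose(P1 + P2 + P3, np.eye(3))

# Conclusion of the theorem: every v decomposes as the sum of its
# projections, one from each range R(P_k).
v = np.array([2.0, -1.0, 5.0])
assert np.allclose(P1 @ v + P2 @ v + P3 @ v, v)
print("all hypotheses and the decomposition check out")
```

Here the decomposition is the familiar splitting of a vector into its coordinate components, and $V = R(P_1) \oplus R(P_2) \oplus R(P_3)$ is the direct sum of the three axes.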
The proof consists of two parts: first, proving that
$(a)$ $V = \sum_{i=1}^{N}R(P_i)$, and then
$(b)$ proving that this sum is in fact a direct sum.
My proof so far is as follows:
$(a)$ Suppose $\vec{v} \in V.$ Then by property $(3)$ and the definition of the identity automorphism, $\vec{v} = I(\vec{v}) = \sum_{k=1}^{N}P_k(\vec{v})$. It suffices to notice that each $P_k(\vec{v}) \in R(P_k)$, so every $\vec{v}$ can be decomposed into elements of the ranges of the projections, as required.
$(b)$ I know a theorem that says that the direct sum of a finite number of subspaces $V_1 \oplus V_2 \oplus \cdots \oplus V_Q$ of $V$ must satisfy the following property:
$\forall R \in \{1, 2, \ldots, Q\},$ $\left(V_R \cap \sum_{k=1}^{R-1}V_k\right) + \left(V_R \cap \sum_{k=R+1}^{Q} V_k\right) = \{\vec{0}\}$.
Suppose $\vec{v} \in \left(R(P_k) \cap \sum_{i=1}^{k-1}R(P_i)\right) + \left(R(P_k) \cap \sum_{i=k+1}^{N}R(P_i)\right)$ for some fixed $k$.
Then $\exists$ $\vec{v_1} \in R(P_k) \cap \sum_{i=1}^{k-1}R(P_i)$ and $\vec{v_2} \in R(P_k) \cap \sum_{i=k+1}^{N}R(P_i)$ such that $\vec{v} = \vec{v_1} + \vec{v_2}$ since the expression above is a sum of subspaces.
Since $\vec{v_1} \in \sum_{i=1}^{k-1}R(P_i)$, there exist $k-1$ vectors $\vec{u_1}, \vec{u_2}, \ldots, \vec{u_{k-1}}$ such that:
$\vec{v_1} = \sum_{i=1}^{k-1}P_{i}(\vec{u_i})$.
By property $(3)$ (writing $j$ for the summation index, since $k$ is already fixed), $\vec{v_1} = I(\vec{v_1}) = \sum_{j=1}^{N}P_{j}\left(\sum_{i=1}^{k-1}P_{i}(\vec{u_i})\right)$
$= \sum_{j=1}^{N}\sum_{i=1}^{k-1}P_{j}(P_{i}(\vec{u_i}))$ by linearity.
By property $(2)$, this sums to $\vec{0}$, since $i \neq j$. So we have $\vec{v_1} = \vec{0}.$
By a similar argument, $\vec{v_2} = \vec{0}.$
Therefore, $\vec{v} = \vec{v_1} + \vec{v_2} = \vec{0}$, so the property $(b)$ is satisfied.
This concludes my proof. The only problem I have with it is that I did not use property $(1)$ anywhere. Can somebody please tell me whether or not I am correct? And if I am wrong, can you please point out my fallacy?
Thank you so much for reading and helping.
Conditions (2) and (3) do indeed imply (1), as noted in the comments, so the statement of the theorem is somewhat awkward.
Suppose conditions (2) and (3) hold, and consider the case $N=2$, for simplicity.
Then $P_1P_2=P_2P_1=0$ and $I=P_1+P_2$. Multiply the second equation by $P_1$. What do you find?
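You can also see this numerically. Below is a sketch with a pair of oblique (non-orthogonal) projections on $\mathbb{R}^2$ that I chose so that conditions (2) and (3) hold; the specific matrices are my own illustrative example:

```python
import numpy as np

# Two oblique projections on R^2 satisfying (2) and (3)
# (illustrative example, not from the textbook).
P1 = np.array([[1.0, 1.0],
               [0.0, 0.0]])
P2 = np.array([[0.0, -1.0],
               [0.0,  1.0]])

# Hypothesis (2): the products vanish in both orders.
assert np.allclose(P1 @ P2, np.zeros((2, 2)))
assert np.allclose(P2 @ P1, np.zeros((2, 2)))

# Hypothesis (3): the projections sum to the identity.
assert np.allclose(P1 + P2, np.eye(2))

# Multiplying I = P1 + P2 by P1 gives P1 = P1^2 + P1 P2 = P1^2,
# so idempotence (1) follows from (2) and (3) alone.
assert np.allclose(P1 @ P1, P1)
assert np.allclose(P2 @ P2, P2)
print("(2) and (3) force (1)")
```

Note that idempotence holds here even though the matrices are not symmetric, i.e. the projections need not be orthogonal for the argument to work.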