In a finite-dimensional vector space $V$, a subset $S \subseteq V$ is said to be linearly dependent if there is a nontrivial solution $(a_i)$ to the equation
$$ \sum_{\mathbf{v_i}\in S}a_i\mathbf{v_i} = \mathbf{0} $$
An equivalent characterisation: a set $S\subseteq V$ is linearly dependent if and only if there is some $\mathbf{v}\in S$ such that
$$ \operatorname{span}(S) = \operatorname{span}(S\setminus\{\mathbf{v}\}) $$
This allows one to prove that if the cardinality of the set is greater than the dimension of the space it spans, then the set must be linearly dependent:
$$ |S| > \dim(\operatorname{span}(S)) $$
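As a quick numerical sanity check of this pigeonhole fact (a sketch using NumPy, not part of the original argument): any three vectors in $\mathbb{R}^2$ satisfy $|S| > \dim(\operatorname{span}(S))$, so a nontrivial dependence must exist, and we can exhibit one.

```python
import numpy as np

# Three vectors in R^2, stored as rows: |S| = 3 > dim(span(S)) <= 2,
# so S must be linearly dependent.
S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

rank = np.linalg.matrix_rank(S)  # dim(span(S))
print(rank)                      # 2
print(len(S) > rank)             # True: more vectors than dimensions

# A witnessing nontrivial combination: 1*(1,0) + 1*(0,1) - 1*(1,1) = 0
coeffs = np.array([1.0, 1.0, -1.0])
print(coeffs @ S)                # [0. 0.]
```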
But the span of $S$ is nothing but the direct sum of the subspaces spanned by the elements of $S$.
This motivates the question: is there a concept analogous to linear independence, but for subspaces instead of individual vectors?
Define the $\operatorname{span}$ of a set of subspaces to be the direct sum of all those subspaces.
Specifically, I want to know: if we have a set of subspaces $\mathcal{S}$ such that
$$ |\mathcal{S}| > \dim(\operatorname{span}(\mathcal{S})), $$
do we also necessarily have some $T\in \mathcal{S}$ such that
$$ \operatorname{span}(\mathcal{S}) = \operatorname{span}(\mathcal{S}\setminus\{T\})? $$
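A small example checking the subspace analogue (a sketch, not a proof; the helper `sum_dim` is my own): take three distinct lines in $\mathbb{R}^2$, so $|\mathcal{S}| = 3 > 2 \geq \dim$, and verify that some line is redundant, i.e. dropping it leaves the sum of the subspaces unchanged.

```python
import numpy as np

# Three distinct lines (1-dimensional subspaces) of R^2,
# each represented by a matrix whose rows span it.
lines = [np.array([[1.0, 0.0]]),
         np.array([[0.0, 1.0]]),
         np.array([[1.0, 1.0]])]

def sum_dim(subspaces):
    """Dimension of the sum of the subspaces: rank of the stacked generators."""
    return np.linalg.matrix_rank(np.vstack(subspaces))

full = sum_dim(lines)  # 2: together the three lines span R^2
# |S| = 3 > 2 = dim, and indeed dropping some line preserves the sum:
redundant = [i for i in range(3)
             if sum_dim([L for j, L in enumerate(lines) if j != i]) == full]
print(full)       # 2
print(redundant)  # [0, 1, 2] -- here every single line is redundant
```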
The context of this question: if this second equivalence were true, it would allow a fairly elegant proof of an Olympiad problem concerning sequences of square matrices that have pairwise zero products, but where the square of each matrix is nonzero. Initially one might think the problem is trivial, since one can associate a subspace with each matrix, but linear combinations of these matrices are not the same as direct sums of the corresponding subspaces.
First, I would advise using the existing term "direct sum" rather than overloading a different term.
I believe the answer to your question is yes. Hint: prove the contrapositive by induction on $\#\mathcal{S}$.