Show that the sum of a finite collection of vectors is independent of how they are added

Here is the proof outlined in the book (via induction).

For the base case, suppose the collection has two vectors $\alpha_a$ and $\alpha_b$. By the commutative law of addition, $$\alpha_a + \alpha_b = \alpha_b + \alpha_a,$$ so the two possible sums agree.

Suppose then that the assertion is true for index sets having fewer than $n$ elements, and consider a collection $\{\alpha_i : i \in I\}$ with $n$ members. Let $\beta$ and $\gamma$ be sums of these vectors computed in two possibly different ways.

In the computation of $\beta$ there was a last addition performed, so that $$\beta = \Big(\sum_{i \in J_1}\alpha_i\Big) + \Big(\sum_{i \in J_2}\alpha_i\Big)$$ where $\{J_1, J_2\}$ partitions $I$. We can write these two partial sums without showing how they were formed, since each has fewer than $n$ terms and so, by the inductive hypothesis, all possible ways of adding them give the same result. Similarly, $\gamma = \big(\sum_{i \in K_1}\alpha_i\big) + \big(\sum_{i \in K_2}\alpha_i\big)$ for some partition $\{K_1, K_2\}$ of $I$.

Now, for $j, k \in \{1, 2\}$, set $$L_{jk} = J_j \cap K_k ~~~\textrm{and}~~~ \zeta_{jk} = \sum_{i\in L_{jk}} \alpha_i$$

where it is understood that $\zeta_{jk} = 0$ if $L_{jk}$ is empty. Since $J_1 = L_{11} \cup L_{12}$ is a disjoint union, the inductive hypothesis gives $\sum_{i \in J_1} \alpha_i = \zeta_{11} + \zeta_{12}$, and similarly for the other three partial sums. Thus

$$\beta = (\zeta_{11} + \zeta_{12}) + (\zeta_{21} + \zeta_{22}) = (\zeta_{11} + \zeta_{21}) + (\zeta_{12} + \zeta_{22}) = \gamma,$$ where the middle equality is just commutativity and associativity applied to four terms, and the proof is complete.
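A concrete instance (my own example, not from the book) may make the construction easier to follow. Take $n = 4$ with $I = \{1,2,3,4\}$, and suppose the last additions were $$J_1 = \{1,2\},\quad J_2 = \{3,4\} \qquad\textrm{and}\qquad K_1 = \{1,3\},\quad K_2 = \{2,4\}.$$ Then the intersections are $$L_{11} = \{1\},\quad L_{12} = \{2\},\quad L_{21} = \{3\},\quad L_{22} = \{4\},$$ so $\zeta_{11} = \alpha_1$, $\zeta_{12} = \alpha_2$, $\zeta_{21} = \alpha_3$, $\zeta_{22} = \alpha_4$, and $$\beta = (\alpha_1 + \alpha_2) + (\alpha_3 + \alpha_4), \qquad \gamma = (\alpha_1 + \alpha_3) + (\alpha_2 + \alpha_4).$$ The proof says these agree because both are regroupings of the same four blocks $\zeta_{jk}$.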


  • I am getting stuck where they introduce $L_{jk}$ and $\zeta_{jk}$. I also can't see what the indices $j$ and $k$ range over, or what they refer to, in this context.

  • I also don't see the connection: after showing how $\beta$ is computed as a last addition over a partition, what is that step supposed to set up for the rest of the argument?
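To make the $L_{jk}$ construction concrete for myself, here is a small numeric sketch (my own example; the specific vectors and partitions are invented): it computes $\beta$ and $\gamma$ from two different partitions of the index set, builds the refinement $L_{jk} = J_j \cap K_k$, and checks that regrouping the four $\zeta_{jk}$ gives the same total either way.

```python
def vec_add(u, v):
    """Componentwise addition of two vectors given as tuples."""
    return tuple(a + b for a, b in zip(u, v))

def vec_sum(vectors, indices):
    """Sum the 2-D vectors whose indices lie in `indices` (empty set -> zero vector)."""
    total = (0, 0)
    for i in sorted(indices):
        total = vec_add(total, vectors[i])
    return total

# Four sample vectors indexed by I = {0, 1, 2, 3} (invented for illustration)
alpha = {0: (1, 2), 1: (3, -1), 2: (0, 5), 3: (-2, 4)}

# Two different "last additions": beta groups by {J1, J2}, gamma by {K1, K2}
J1, J2 = {0, 1}, {2, 3}
K1, K2 = {0, 2}, {1, 3}

beta = vec_add(vec_sum(alpha, J1), vec_sum(alpha, J2))
gamma = vec_add(vec_sum(alpha, K1), vec_sum(alpha, K2))

# The common refinement L_jk = J_j ∩ K_k and its partial sums zeta_jk
L = {(j, k): (J1 if j == 1 else J2) & (K1 if k == 1 else K2)
     for j in (1, 2) for k in (1, 2)}
zeta = {jk: vec_sum(alpha, idx) for jk, idx in L.items()}

# beta regroups as (z11 + z12) + (z21 + z22); gamma as (z11 + z21) + (z12 + z22)
beta_from_zeta = vec_add(vec_add(zeta[(1, 1)], zeta[(1, 2)]),
                         vec_add(zeta[(2, 1)], zeta[(2, 2)]))
gamma_from_zeta = vec_add(vec_add(zeta[(1, 1)], zeta[(2, 1)]),
                          vec_add(zeta[(1, 2)], zeta[(2, 2)]))

print(beta, gamma, beta_from_zeta, gamma_from_zeta)  # all four equal (2, 10)
```

Seeing the four sets $L_{jk}$ printed out for a small case is what finally made the "common refinement" idea click for me: each $J_j$ and each $K_k$ is a disjoint union of two of the $L_{jk}$, so both $\beta$ and $\gamma$ are just different regroupings of the same four blocks.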