Sum of two vectors is the vector whose components are the sums of the components


Consider geometric vectors, for example, where addition is defined by the parallelogram law. How can I prove that the sum of two vectors is the vector whose components, with respect to some basis, equal the sums of the corresponding components of the individual vectors? More generally, how can I do the same for any vector space whose addition satisfies the vector-space axioms but has a non-obvious formulation, such as the parallelogram law?


There are 2 best solutions below


Let $v,\,w$ be vectors and let $\{e_i\}$ be an orthonormal basis, so that $v=\sum_i (v\cdot e_i)e_i$ and similarly for $w$. Then $$v+w=\sum_i ((v+w)\cdot e_i)e_i=\sum_i (v\cdot e_i+w\cdot e_i)e_i=\sum_i (v\cdot e_i)e_i+\sum_i (w\cdot e_i)e_i.$$Reading off the coefficient of $e_i$ in the first and last expressions gives $(v+w)_i=v_i+w_i$.

Edit: we can do without dot products as long as the components $v_i$ satisfy $v=\sum_i v_i e_i$ for linearly independent basis elements $e_i$, so that the $v_i$ in that sum are unique. Then $$\sum_i (v+w)_ie_i=v+w=\sum_i v_ie_i+\sum_i w_i e_i=\sum_i (v_i+w_i)e_i\implies (v+w)_i=v_i+w_i.$$
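As a numerical sanity check of this component identity (not a proof), here is a short Python sketch. The basis matrix `E` is an arbitrary illustrative choice, deliberately non-orthonormal, so the components must be found by solving a linear system rather than by dot products:

```python
import numpy as np

# Columns of E are the basis vectors e_1, e_2 of R^2.
# A hypothetical non-orthonormal basis, chosen for illustration.
E = np.array([[1.0, 1.0],
              [0.0, 2.0]])

def components(x):
    """Return the unique coordinates c with E @ c = x."""
    return np.linalg.solve(E, x)

v = np.array([3.0, 4.0])
w = np.array([-1.0, 2.0])

# Components of the sum equal the sum of the components.
assert np.allclose(components(v + w), components(v) + components(w))
```

Uniqueness of the coordinates is exactly what makes `np.linalg.solve` well defined here: it requires `E` to be invertible, i.e. the basis vectors to be linearly independent.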


What does the vector $(1,2)$ mean? It means the sum of $1$ times the first basis vector (say $i$) and $2$ times the second (say $j$). So, e.g., you want to prove that: $$(a,b)+(c,d)= (a i + b j) + (c i + d j) = (a + c) i + (b + d) j = (a+c, b+d).$$

So you need to prove that vector addition is commutative, associative, and that scalar multiplication distributes over (vector) addition.

Then everything should fall out.