Let $V$ be a vector space and $A = \{a_1, \dots, a_r\} \subset V$ a subset. Show that the elements of $A$ are linearly dependent if there exists an index $i \in \{1, \dots, r\}$ and real numbers $\lambda_j$, $j \in \{1, \dots, r\}$, $j \neq i$, such that the following holds:
$$a_i = \sum_{\substack{j=1 \\ j \neq i}}^{r} \lambda_j \, a_j$$
If somebody could help me out, I would be very happy. My guess is that I have to subtract $a_i$ from the sum of the other elements to get zero, but I have no idea what a formally correct answer might look like.
It is essentially the definition of linear dependence:
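Your guess is exactly right. As a sketch (assuming the displayed equation reads $a_i = \sum_{j \neq i} \lambda_j a_j$), bring $a_i$ to the other side:

$$a_i = \sum_{\substack{j=1 \\ j \neq i}}^{r} \lambda_j \, a_j
\;\Longrightarrow\;
\sum_{\substack{j=1 \\ j \neq i}}^{r} \lambda_j \, a_j + (-1)\, a_i = 0.$$

The left-hand side is a linear combination of $a_1, \dots, a_r$ in which the coefficient of $a_i$ is $-1 \neq 0$, so not all coefficients vanish. A nontrivial linear combination equal to the zero vector is precisely the definition of linear dependence.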