Proof for the linear dependence of vectors

Let V be a vector space and A = {a_1, …, a_r} ⊂ V. Show that the elements of A are linearly dependent if there exists an index i ∈ {1, …, r} and real numbers λ_j for j ∈ {1, …, r}, j ≠ i, such that the following holds:

$$a_i \;=\; \sum_{\substack{j=1 \\ j \neq i}}^{r} \lambda_j\, a_j$$

If somebody could help me out, I would be very happy. My guess is that I have to subtract a_i from the sum of the other elements to get the zero vector, but I have no idea what a formally correct answer might look like.

1 Answer

This is essentially the definition of linear dependence:

A set of vectors is said to be linearly dependent if at least one of the vectors in the set can be defined as a linear combination of the others
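
To spell that out for this problem (a minimal sketch of the formal argument; the choice λ_i := −1 is notation I am introducing, not part of the problem statement): starting from the given equation $a_i = \sum_{j \neq i} \lambda_j a_j$, subtract $a_i$ from both sides and set $\lambda_i := -1$. Then

$$\sum_{j=1}^{r} \lambda_j a_j \;=\; \sum_{j \neq i} \lambda_j a_j \;-\; a_i \;=\; a_i - a_i \;=\; 0,$$

and this linear combination is nontrivial, since the coefficient $\lambda_i = -1$ is nonzero. A nontrivial linear combination of $a_1, \dots, a_r$ equal to the zero vector is exactly the definition of linear dependence, so the elements of A are linearly dependent.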