How many solutions are there if we linearly combine a set of linearly dependent vectors to produce another vector?


The book I am reading says the following about linearly independent vectors, but it doesn't mention linearly dependent vectors. Is it safe to assume that for a set of linearly dependent vectors there are infinitely many sets of coefficients that can produce a vector not in the set? Or would it just be more than one way?

[Image: excerpt from the book on linear independence]

Accepted answer:

If $a_1,\dots, a_n$ are linearly dependent and $x$ is a linear combination of these vectors, then there are infinitely many such linear combinations. To see this, note that since these vectors are linearly dependent, there are coefficients $\alpha_1,\dots,\alpha_n$, not all zero, such that:

$ \alpha_1a_1 + \dots + \alpha_n a_n = 0 ~.$

Note that the above equation still holds if we multiply all the coefficients by any scalar $\alpha\in\mathbb{R}$ (this is just multiplying both sides of the equation by $\alpha$). Since the coefficients are not all zero (by linear dependence), different values of $\alpha$ give different coefficient sets, so there are infinitely many ways to represent the zero vector as a linear combination of those vectors.

Now take some linear combination of $a_1,\dots,a_n$ equal to $x$ (which exists by assumption) and add to it a representation of the zero vector as a linear combination of those vectors. Since there are infinitely many ways to represent the zero vector, there are also infinitely many ways to represent the vector $x$ as a linear combination of those vectors.
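The argument above can be sketched numerically. This is a hypothetical example (the vectors $a_1, a_2, a_3$, the target $x$, and the names `particular` and `null` are made up for illustration): one particular representation of $x$, shifted by any multiple of a nontrivial representation of the zero vector, is again a representation of $x$.

```python
import numpy as np

# Dependent set: a3 = a1 + a2.
a1 = np.array([1.0, 0.0])
a2 = np.array([0.0, 1.0])
a3 = np.array([1.0, 1.0])

x = np.array([2.0, 3.0])                  # target vector
particular = np.array([2.0, 3.0, 0.0])    # 2*a1 + 3*a2 + 0*a3 = x
null = np.array([1.0, 1.0, -1.0])         # 1*a1 + 1*a2 - 1*a3 = 0

# Adding any multiple of the "zero" combination gives a new
# coefficient set that still produces x.
for alpha in [1.0, -2.0, 10.0]:
    c = particular + alpha * null
    assert np.allclose(c[0] * a1 + c[1] * a2 + c[2] * a3, x)
```

Each choice of `alpha` yields a distinct coefficient triple, which is exactly why the count of representations is infinite rather than merely "more than one".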

This question is actually the same as asking how many solutions the system $Ac = x$ has, where $A$ is the matrix with columns $a_1,\dots,a_n$ and $c$ is the vector of coefficients. The only possible answers are that there are no solutions, there is a unique solution, or there are infinitely many solutions.
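In matrix terms, linear dependence of the columns means the matrix is rank-deficient, so whenever the system is consistent it has infinitely many solutions. A minimal sketch, using the same hypothetical vectors as above (`np.linalg.lstsq` is used only to produce one particular solution):

```python
import numpy as np

# Columns are a1, a2, a3 with a3 = a1 + a2, so rank < number of columns.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
x = np.array([2.0, 3.0])

# Rank deficiency is the matrix-language statement of linear dependence.
assert np.linalg.matrix_rank(A) < A.shape[1]

# lstsq returns one particular solution; the full solution set is this
# solution plus any element of the null space of A.
c, *_ = np.linalg.lstsq(A, x, rcond=None)
assert np.allclose(A @ c, x)
```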