I'm studying elementary linear algebra right now, and the current section is on linear independence. As I create matrices from the vectors and row reduce them in a calculator, I get various results.
Some row-reduced forms have nonzero integers on the right-hand side except for the bottom entry, which I understand to indicate linear dependence. Other forms have all zeros on the right-hand side, which I understand to indicate linear independence. In another case, the right-hand side is all zeros except for the bottom entry. Normally, I know that this means no solution, but my professor has stated, "The vector is not in the span of the other vectors."
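For example, two of the cases I'm describing can be reproduced in Python's `sympy` instead of a calculator (the vectors here are ones I made up myself, not from the book):

```python
from sympy import Matrix

# Columns are v1 = (1,0,0), v2 = (0,1,0), and a third vector b,
# augmented as [v1 v2 | b].

# Case 1: b = (2, 3, 0) IS in span{v1, v2}
A1 = Matrix([[1, 0, 2],
             [0, 1, 3],
             [0, 0, 0]])
print(A1.rref()[0])  # last column (2, 3, 0): b = 2*v1 + 3*v2

# Case 2: b = (0, 0, 1) is NOT in span{v1, v2}
A2 = Matrix([[1, 0, 0],
             [0, 1, 0],
             [0, 0, 1]])
print(A2.rref()[0])  # bottom row (0 0 | 1): inconsistent, no solution
```

In the first case the right-hand column ends in a zero and the system is solvable; in the second, the bottom row reads $0 = 1$, which is exactly the "not in the span" situation.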
I want to know the mathematical difference between linear independence and "not being in the span." If a vector is not in the span, does that mean it is not linearly dependent on the others? And if it cannot be linearly dependent, why not just call it linearly independent, since the linear combination will never equal 0 unless every coefficient is 0?
If there are any errors in my understanding or assumptions, please correct me and shed light on my ignorant mind as the final exam approaches.
Another question: can linearly independent vectors be written as a linear combination? I thought that the coefficients cannot all be 0, but my textbook seems to think they can be. It claims that the zero vector is a linear combination of two vectors whose linear combination can never be 0 unless every coefficient is 0.
Sorry for another edit; the questions are just piling up. My textbook says that the zero vector is a linear combination of some vectors, but that another nonzero vector is not a linear combination of them. Why can't every coefficient be 0 to make the nonzero vector a linear combination as well?
Consider the set of vectors $\{\mathbf v_1, \mathbf v_2, \dots, \mathbf v_k\}$ in a vector space $V$ over the field $F$ (the vector space might be $\Bbb R^3$, for instance, and the field of scalars might be $\Bbb R$). Here are some definitions:

- A **linear combination** of $\mathbf v_1, \dots, \mathbf v_k$ is any vector of the form $c_1\mathbf v_1 + c_2\mathbf v_2 + \cdots + c_k\mathbf v_k$, where $c_1, \dots, c_k \in F$ are scalars (which are allowed to be $0$).
- The **span** of $\{\mathbf v_1, \dots, \mathbf v_k\}$ is the set of all linear combinations of those vectors.
- The set is **linearly independent** if the only solution to $c_1\mathbf v_1 + \cdots + c_k\mathbf v_k = \mathbf 0$ is $c_1 = \cdots = c_k = 0$.
- The set is **linearly dependent** if that equation has a solution in which at least one $c_i \ne 0$.
Now let's show that a linearly dependent set has at least one vector which is a linear combination of the others. Let $\{\mathbf a,\mathbf b,\mathbf c\} \subset \Bbb R^3$ be a set of linearly dependent vectors. Then by definition, the equation $$x\mathbf a+y\mathbf b+z\mathbf c=\mathbf 0$$ in the scalars $x,y,z$ has more than one solution ($x=y=z=0$ is certainly a solution, but it's not the only one). Thus at least one of the scalars $x,y,z$ is nonzero; WLOG, let's say it's $x$. We can rearrange this equation by subtracting every term except $x\mathbf a$ from both sides: $$x\mathbf a=-y\mathbf b-z\mathbf c$$ Now because $x \ne 0$, we can divide both sides by $x$ to get $$\mathbf a=-\frac yx\mathbf b -\frac zx\mathbf c$$ Thus $\mathbf a$ is a linear combination of $\mathbf b$ and $\mathbf c$. You can see why this would fail in the linearly independent case: you wouldn't be able to divide by any of the coefficients, because they are all zero.
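Here's the same argument run on concrete numbers, as a quick Python sanity check (the vectors are ones I picked myself, with $\mathbf a = 2\mathbf b + 3\mathbf c$ so that the set is dependent):

```python
# Vectors chosen for illustration: a = 2b + 3c, so {a, b, c} is dependent.
b = [1, 0, 0]
c = [0, 1, 0]
a = [2*bi + 3*ci for bi, ci in zip(b, c)]   # a = (2, 3, 0)

# One nontrivial solution of x*a + y*b + z*c = 0: (x, y, z) = (1, -2, -3)
x, y, z = 1, -2, -3
combo = [x*ai + y*bi + z*ci for ai, bi, ci in zip(a, b, c)]
print(combo)        # [0, 0, 0]

# Since x != 0, rearranging gives a = (-y/x) b + (-z/x) c:
a_rebuilt = [(-y/x)*bi + (-z/x)*ci for bi, ci in zip(b, c)]
print(a_rebuilt)    # [2.0, 3.0, 0.0] -- we recover a from b and c
```

The last line is exactly the rearrangement in the proof: because $x \ne 0$, the coefficients $-y/x = 2$ and $-z/x = 3$ express $\mathbf a$ in terms of $\mathbf b$ and $\mathbf c$.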
Now let's consider the set of vectors $\{\mathbf 0\}$, that is, the set containing only the zero vector. Is this set linearly independent or linearly dependent? It is linearly dependent, because $x\mathbf 0=\mathbf 0$ has infinitely many solutions (any $x$ works, not just $x=0$). Likewise, any set which contains the zero vector will be a linearly dependent set (confirm this for yourself).
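To confirm that last claim numerically, here is a small Python check (with an arbitrary vector of my own choosing alongside $\mathbf 0$): give the zero vector any nonzero coefficient and every other vector the coefficient $0$, and the combination is still $\mathbf 0$.

```python
v1   = [1, 2, 3]    # an arbitrary vector
zero = [0, 0, 0]    # the zero vector

# Nontrivial coefficients: nonzero on the zero vector, 0 elsewhere.
c_zero, c_v1 = 5, 0
combo = [c_zero*zi + c_v1*vi for zi, vi in zip(zero, v1)]
print(combo)  # [0, 0, 0] even though c_zero != 0
```

A nontrivial solution exists, so any set containing $\mathbf 0$ is linearly dependent.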
I now claim that the zero vector in a vector space $V$ is a linear combination of any non-empty set of vectors in $V$. Can you see why that must be true?
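As a hint, here is a sketch in Python (the vectors are arbitrary ones I made up): a linear combination is allowed to use the coefficient $0$, so choose every coefficient to be $0$.

```python
# Any nonempty collection of vectors will do; these are made up.
vs = [[1, 2], [3, 4], [5, 6]]

# The all-zero choice of coefficients is a perfectly valid
# linear combination -- and it always produces the zero vector.
coeffs = [0, 0, 0]
combo = [sum(c*v[i] for c, v in zip(coeffs, vs)) for i in range(2)]
print(combo)  # [0, 0]
```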
Does this answer your questions, or is there something else I need to touch on?