Definition of Linear Dependence/Independence


I have been learning about linear dependence for the past week or so, and I'm trying to distill the concept into a definition that can be easily understood.

If we have the set $s$:

$s = \{\begin{bmatrix}a_1\\b_1\end{bmatrix}, \begin{bmatrix}a_2\\b_2\end{bmatrix}\}$

We know that $s$ is linearly dependent if and only if there exist scalars $c_1$ and $c_2$, not both equal to $0$, such that:

$c_1 a_1 + c_2 a_2 = 0$

$c_1 b_1 + c_2 b_2 = 0$
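A quick numerical sketch of this test (the sample entries below are illustrative, not taken from the question): the system above has a solution with $c_1, c_2$ not both zero exactly when the determinant of the coefficient matrix $\begin{bmatrix}a_1&a_2\\b_1&b_2\end{bmatrix}$ is zero.

```python
import numpy as np

def is_dependent(a1, b1, a2, b2):
    """True iff the columns [a1, b1] and [a2, b2] are linearly dependent,
    i.e. the 2x2 determinant vanishes."""
    M = np.array([[a1, a2], [b1, b2]], dtype=float)
    return bool(np.isclose(np.linalg.det(M), 0.0))

print(is_dependent(1, 2, 2, 4))  # second column doubles the first -> True
print(is_dependent(1, 0, 0, 1))  # standard basis vectors -> False
```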

Based on this fact, are the definitions I have proposed below for linear dependence and independence correct?

Linear Dependence:

If a set is linearly dependent, then this means that a vector in the set can be represented by some linear combination of the other vector(s) in the set.

Linear Independence:

If a set is linearly independent, then this means that any arbitrary vector in $\mathbb R^{n}$ can be represented by some linear combination of the vectors in the set.


There are 3 best solutions below


Linear Dependence:

If a set is linearly dependent, then this means that a vector in the set can be represented by some linear combination of the other vector(s) in the set.

Correct.

Linear Independence:

If a set is linearly independent, then this means that any arbitrary vector in $\mathbb R^{n}$ can be represented by some linear combination of the vectors in the set.

Incorrect. The set $$A=\left\{\begin{bmatrix}1\\1\end{bmatrix}\right\}$$ is a linearly independent set; however, the vector $\begin{bmatrix}0\\1\end{bmatrix}$ cannot be written as a linear combination of vectors in $A$.
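This counterexample can be checked numerically with a least-squares fit: if the residual of the best approximation is nonzero, the target vector is not in the span.

```python
import numpy as np

A = np.array([[1.0], [1.0]])     # the single vector in the set A, as a column
target = np.array([0.0, 1.0])

# Least-squares: find the scalar c minimizing ||A c - target||.
c, residual, rank, _ = np.linalg.lstsq(A, target, rcond=None)
print(residual)  # nonzero residual -> target is not in span(A)
```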


The concepts closely connected to linear independence, which also deal with which vectors can be written as which kinds of linear combinations, are the span and the basis.

The topic is a little too broad to cover in detail on this site, but the general idea is this:

  • The span of a set is the set of all linear combinations we can form from its elements
  • A set is linearly independent if it is not linearly dependent
  • A set $S$ is a basis for vector space $V$ if the span of $S$ is $V$ and if $S$ is linearly independent.

Interesting things that follow from the definitions above include, but are not limited to:

  • If $B$ is a basis for $V$, then every element $v\in V$ can be written in exactly one way as a linear combination of elements from $B$.
  • Each linearly independent set is a basis of the space that it spans
  • All bases of a vector space have equal size
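The uniqueness fact above can be sketched in code (the basis below is an illustrative choice): with the basis vectors as columns of an invertible matrix $B$, the coordinates of $v$ are the unique solution of $Bx = v$.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns form a basis of R^2 (det != 0)
v = np.array([3.0, 2.0])

x = np.linalg.solve(B, v)    # the unique coordinate vector of v in basis B
print(x)
print(B @ x)                 # reconstructs v exactly
```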

The last one is false and the others are right. Even a single non-zero vector in $\mathbb R^{n}$ forms a linearly independent set, yet you cannot write every vector as a linear combination of it. What is true is that if you have $n$ linearly independent vectors $x_1, x_2, \dots, x_n$ in $\mathbb R^{n}$, then any vector can be written as a linear combination of them.
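The true claim can be illustrated numerically (the three vectors below are an illustrative assumption, chosen to be independent): stacking $n$ independent vectors as the columns of a matrix gives an invertible system, so any target vector has a coefficient solution.

```python
import numpy as np

# Columns x_1, x_2, x_3: three linearly independent vectors in R^3.
X = np.column_stack([[1.0, 0.0, 1.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(X))      # rank 3: the columns are independent

target = np.array([2.0, -1.0, 5.0])  # an arbitrary vector in R^3
coeffs = np.linalg.solve(X, target)  # coefficients of the combination
print(np.allclose(X @ coeffs, target))
```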


Your extrapolation on linear independence cannot be right: if you take $n$ linearly independent vectors and set one aside, the remaining $n-1$ cannot represent it, even though they are still linearly independent.

Even simpler: a singleton set is perforce linearly independent, but cannot generate the whole of $\mathbb R^n$.


Vectors are linearly independent iff the only linear combination that yields the zero vector is the trivial combination (all zero coefficients). In other words, vectors are independent if they are not… dependent.