A basic doubt on linear dependence and basis vectors


I see that linear independence/dependence is defined in books for a finite set of vectors. But basis vectors are always required to be independent, and a basis need not be finite. Is the definition consistent?


There are 3 answers below.

BEST ANSWER

Here is a definition for linear dependence from the book Linear Algebra by Friedberg, Insel and Spence (using the fourth edition here):

A subset $S$ of a vector space $V$ is called linearly dependent if there exists a finite number of distinct vectors $u_1,u_2,\ldots,u_n$ in $S$ and scalars $a_1,a_2,\ldots,a_n$, not all zero, such that \begin{equation}a_1u_1+a_2u_2+\cdots+a_nu_n=0.\end{equation} In this case we also say that the vectors of $S$ are linearly dependent.

The definition of linearly independent is then simply given (in the same book) as: A subset $S$ of a vector space that is not linearly dependent is called linearly independent.

So from these definitions it does not seem that the subset $S$ is required to be finite. The simplest example I can think of is the vector space of all polynomials with coefficients from a field. A basis for this space is $\{1,x,x^2,\ldots\}$, which is linearly independent and infinite.
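To see why this set is independent, it is enough to check every finite linear combination directly (a short verification using only comparison of coefficients): suppose a finite linear combination of distinct monomials $x^{k_1},\ldots,x^{k_n}$ is the zero polynomial. Then

```latex
a_{1}x^{k_1}+a_{2}x^{k_2}+\cdots+a_{n}x^{k_n}=0
\quad\Longrightarrow\quad
a_{1}=a_{2}=\cdots=a_{n}=0,
```

since a polynomial is the zero polynomial exactly when all of its coefficients vanish. Hence no finite subset of $\{1,x,x^2,\ldots\}$ witnesses dependence, which is precisely the definition quoted above.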

ANSWER

For an infinite set, we can define it to be independent iff every finite subset of it is independent.
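This reduces the infinite case to finitely many checks at a time, and each finite check is computable. Here is a minimal sketch for vectors in $\mathbb{R}^n$ (the helper name `is_independent` and the use of NumPy's matrix rank are my own choices, not from any of the answers):

```python
import numpy as np

def is_independent(vectors):
    """Return True if the given finite list of vectors in R^n is
    linearly independent, i.e. the matrix they form has full row rank."""
    A = np.array(vectors, dtype=float)
    return bool(np.linalg.matrix_rank(A) == len(vectors))

# A finite subset of the standard basis is independent:
print(is_independent([[1, 0, 0], [0, 1, 0]]))              # True
# Adding the sum of the first two vectors creates a dependence:
print(is_independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))   # False
```

Under this criterion, an infinite set is declared independent when every such finite test succeeds; a single failing finite subset already makes the whole set dependent.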

ANSWER

A linear combination of a family $\{v_i\}_{i\in I}$ of vectors is an expression of the form $\sum_{i\in I} c_iv_i$ where all but finitely many of the $c_i$ are zero. Without this restriction, it would not even be possible to define the "sum" in general. With this convention, $\{v_i\}_{i\in I}$ is linearly independent if the only linear combination producing $0$ is the trivial one, and $\{v_i\}_{i\in I}$ generates $V$ if each $v\in V$ can be written as a linear combination. So it is already the definition of linear combination that introduces the finiteness restriction.