I have very recently started studying linear algebra, and I have the following question. I know that the image of a linear transformation $T$ is $\operatorname{span}\{T(v_1),T(v_2),\dots,T(v_n)\}$, where $v_1,\dots,v_n$ is a basis of the domain. Now, the rank of a linear transformation is the dimension of that span. I understand that it is equivalent to say that [1]: the rank of the matrix associated with that transformation is the dimension of the span of the columns of the matrix. What I don't understand is why I usually find in textbooks that [2]: the rank of a matrix is the number of linearly independent columns of the matrix. How can we go from [1] to [2]?
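To make the two statements concrete, here is a small numerical sketch (the matrix is my own illustrative example, not from the question): NumPy's `matrix_rank` computes the dimension of the column span, statement [1], and we can check by hand that this agrees with the count of linearly independent columns, statement [2].

```python
import numpy as np

# Illustrative matrix: the third column is the sum of the first two,
# so the columns span only a 2-dimensional subspace of R^3.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.]])

# [1]: rank = dimension of the span of the columns
print(np.linalg.matrix_rank(A))  # 2

# [2]: columns 1 and 2 are linearly independent, and column 3 is their
# sum, so the maximal number of linearly independent columns is also 2.
assert np.allclose(A[:, 2], A[:, 0] + A[:, 1])
```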
Confusion about rank, matrices, and linear transformations
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 answers below.
One explanation is that this is a consequence of the dimension theorem, which states that any basis for the image of the transformation $T$ will have the same number of vectors. With that in mind, let $S = \{T(v_1),\dots,T(v_n)\}$. I claim that the following is true as a consequence of the dimension theorem:
A linearly independent subset $S' = \{T(v_{i_1}),\dots,T(v_{i_k})\} \subsetneq S$ can be extended to a larger linearly independent subset of $S$ if and only if we have $\operatorname{span}(S') \subsetneq \operatorname{span}(S)$.
(Actually, this weaker statement is not too difficult to prove without using the dimension theorem).
With that in mind, let $d = \dim \operatorname{span}(S)$. There will necessarily exist a linearly independent set $S' \subseteq S$ with $d$ elements (why?). In fact, this set $S'$ of $d$ elements must form a basis for the image of $T$, i.e. $\operatorname{span}(S)$ (why?). Moreover, any set with more than $d$ elements cannot be a basis for the image of $T$ (why?).
So, it is indeed the case that the dimension $d$ of the image of $T$ is equal to the maximal size among the linearly independent subsets of $S$.
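The claim above can be checked numerically. In this sketch (the vectors and the helper names `spans_less` and `can_extend` are my own, for illustration), "can be extended" and "spans strictly less" are both decided by rank comparisons, and they agree on every independent subset we try.

```python
import numpy as np

def spans_less(S_prime, S):
    """True iff span(S_prime) is strictly contained in span(S)."""
    return (np.linalg.matrix_rank(np.column_stack(S_prime + S))
            > np.linalg.matrix_rank(np.column_stack(S_prime)))

def can_extend(S_prime, S):
    """True iff some vector of S enlarges S_prime to a bigger
    linearly independent set."""
    k = len(S_prime)
    return any(
        np.linalg.matrix_rank(np.column_stack(S_prime + [v])) == k + 1
        for v in S)

# Images T(v_i); the third is the sum of the first two, so d = 2.
S = [np.array([1., 0., 0.]),
     np.array([0., 1., 0.]),
     np.array([1., 1., 0.])]

S1 = [S[0]]          # independent, but spans strictly less than span(S)
S2 = [S[0], S[1]]    # independent, already spans all of span(S)

print(can_extend(S1, S), spans_less(S1, S))  # True True
print(can_extend(S2, S), spans_less(S2, S))  # False False
```

The second pair of outputs illustrates the "only if" direction: once an independent subset already spans the image, no further vector of $S$ can be added without breaking independence.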
Let's say the columns of an $m\times n$ matrix $A$ are given as $\xi_1,\dots, \xi_n \in \Bbb{F}^m$. The span of its columns is the vector subspace $S=\text{span}\{\xi_1,\dots,\xi_n\}=\left\{\sum_{i=1}^nc_i\xi_i\,| \, c_1,\dots, c_n\in \Bbb{F}\right\}$. The dimension of a vector space is by definition the size of one (hence any) basis. Now, what is a basis? A basis is a set of vectors which are linearly independent and which span the vector space.
In finite dimensions, if you have a spanning set (like $\{\xi_1,\dots, \xi_n\}$), then it contains a linearly independent subset with the same span (this should be one of the first few theorems proven in any linear algebra book). In other words, some subset $\beta\subseteq \{\xi_1,\dots, \xi_n\}$ is actually a basis for $S$. So $\dim S = |\beta|$ is the maximal number of linearly independent columns of the matrix.
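The theorem "a spanning set contains a basis" can be carried out as a greedy procedure: walk through the columns and keep each one only if it is not already in the span of the columns kept so far. A sketch, with an illustrative matrix of my own choosing:

```python
import numpy as np

# Columns xi_1..xi_4, with xi_3 = xi_1 + xi_2 and xi_4 = 2*xi_1.
A = np.array([[1., 0., 1., 2.],
              [0., 1., 1., 0.],
              [0., 0., 0., 0.]])

# Extract a basis beta from the spanning set of columns: keep a column
# only if adding it keeps the collected set linearly independent.
beta = []
for j in range(A.shape[1]):
    trial = beta + [A[:, j]]
    if np.linalg.matrix_rank(np.column_stack(trial)) == len(trial):
        beta.append(A[:, j])

B = np.column_stack(beta)
# beta is linearly independent and has the same span as all the columns:
assert np.linalg.matrix_rank(B) == len(beta)
assert np.linalg.matrix_rank(np.hstack([B, A])) == len(beta)
print(len(beta))  # 2, so dim S = |beta| = 2
```

Here `beta` ends up containing the first two columns, and its size equals both $\dim S$ and the rank computed directly from $A$, which is exactly the passage from [1] to [2].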