Linear Algebra: Rowspace, column space (Concept question)


Freshman student here, looking to clarify some concepts.

1.) Is the column space also the solution space of a matrix?

2.) Let's say a matrix is formed by columns $a_1, a_2, a_3, a_4, a_5$. I find the rref form of this matrix; it turns out $a_1, a_2, a_3$ are linearly independent. Does it follow that $a_1, a_2, a_3$ form a basis for both the solution space and the column space of the matrix? Can this tell me anything about the row space?

3.) What is the purpose of the row space?

4.) Is the basis for the row space also the basis for the solution space in a square matrix? From here on out it becomes increasingly difficult for me to visualize.

5.) Apart from the dimension theorem, what other purpose does rank actually serve? I am familiar with the formulas regarding rank, such as rank $\leq \min(m,n)$ and rank $= \min(\dim(\text{rowspace}), \dim(\text{colspace}))$, but I am not sure I understand the implications of them.

6.) If dim(rowspace)<dim(colspace), can this tell me anything about the solution space or any implications (apart from implications on the rank)?

Sorry for the long-winded question. I hope someone can provide insight on anything above, for me and my friends.


Best answer:

Let's go through these one by one.

  1. Yes. The column space of a matrix $A$ can be thought of as the set of all vectors $\vec b$ such that $A\vec x=\vec b$ has a solution. This is because multiplying a matrix by a vector takes a linear combination of the matrix's columns, with the entries of $\vec x$ as the weights. The column space is precisely the set of all such linear combinations, so the column space and this "solution space" (the set of attainable right-hand sides) are identical. (Beware, though: "solution space" is also commonly used for the set of solutions of $A\vec x=\vec 0$, i.e. the null space, which is a different object.)
  2. Yes for the column space, provided the rank is exactly $3$, i.e. $a_4$ and $a_5$ are linear combinations of $a_1, a_2, a_3$. In general, in a matrix of rank $n$, any $n$ linearly independent column vectors form a basis for the column space.
  3. This is a bit of an open-ended question, but there are a few important things to know about the row space. First, the row space and the column space have the same dimension, $\text{rank }A$. This is not at all obvious until you recognize the second fact: every vector in $\text{Row }A$ is orthogonal to every vector in $\text{Nul }A$. (If you haven't seen that in class yet, spend ten to twenty minutes playing with matrices of different sizes, and see if you can figure out why it's true.) It's also important to realize that the row space of $A$ is the column space of $A^T$, which means a matrix and its transpose have the same rank.
  4. No. As an extreme example, consider $\begin{bmatrix}1&0&0\\1&0&0\\1&0&0\end{bmatrix}$. The row space is spanned by $\begin{bmatrix}1\\0\\0\end{bmatrix}$ while the column space is spanned by $\begin{bmatrix}1\\1\\1\end{bmatrix}$. Still, there are a few things you should notice about these spaces. First, they have the same dimension. Second, if we perform Gauss-Jordan elimination on our matrix, the column space may change, but the row space (and therefore the null space, by what we discussed in #3) does not. This is true for all matrices, not just square ones. (Note that in this particular case the row space and the column space coincide after Gauss-Jordan elimination; this is not true in general.)
  5. Rank is crucially important because it equals both the dimension of the column space and the dimension of the row space. Finding the rank allows you to determine invertibility, and once you get a feel for rank as "the dimension of the range of $A$", you can easily see why inequalities like $\text{rank}(AB)\leq\min(\text{rank }A,\text{rank }B)$ hold. From there, you can look at things like matrices $A$ where $A^n$ eventually becomes the zero matrix, and analyze how rank affects transformations. There's also the ever-important rank theorem: for an $m\times n$ matrix $A$, $\text{rank }A+\dim\text{Nul }A=n$.
  6. The dimension of your row space and the dimension of your column space must be equal; both equal the rank. If you get different dimensions, go back and check your work. I'll leave you with, once again, the fundamental property of rank:

$$\text{rank }A=\text{dim Col }A=\text{dim Row }A$$
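These facts are easy to check numerically. Here is a minimal NumPy sketch, using the $3\times 3$ example from point 4 above, that verifies the rank identities, the orthogonality of $\text{Row }A$ and $\text{Nul }A$, and the rank theorem:

```python
import numpy as np

# The 3x3 example from point 4: column space spanned by (1,1,1),
# row space spanned by (1,0,0).
A = np.array([[1.0, 0, 0],
              [1.0, 0, 0],
              [1.0, 0, 0]])

# rank A = dim Col A = dim Row A, and a matrix shares its rank with its transpose
print(np.linalg.matrix_rank(A))    # 1
print(np.linalg.matrix_rank(A.T))  # 1

# Null-space basis via SVD: the right-singular vectors whose singular
# values are (numerically) zero span Nul A.
_, s, vt = np.linalg.svd(A)
null_basis = vt[np.sum(s > 1e-10):]  # here: 2 basis rows

# Every row of A is orthogonal to every null-space vector
print(np.allclose(A @ null_basis.T, 0))  # True

# Rank theorem: rank A + dim Nul A = n (number of columns)
print(np.linalg.matrix_rank(A) + null_basis.shape[0])  # 3
```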

Another answer:

The column space is the range, or image, of the transformation, i.e. the set of all possible outputs of $A\vec x$. This is analogous to the range of a single-variable function.
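A quick NumPy check (numbers chosen arbitrarily) that $A\vec x$ really is a weighted sum of $A$'s columns, which is why the image is exactly the column space:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, -1.0])

# A @ x and the explicit linear combination of A's columns agree
by_matmul = A @ x
by_columns = 10.0 * A[:, 0] + (-1.0) * A[:, 1]
print(np.allclose(by_matmul, by_columns))  # True
```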

If a $5\times 5$ matrix has a rank of 3, then you can choose any set of 3 independent column vectors as a basis of the column space.

The row space is a little less intuitive to me. It can be thought of as the span of the coefficient vectors of the linear equations the matrix encodes. Or you could think of it as the set of vectors orthogonal to the kernel of the matrix.

The rank of a matrix tells me how much information is compressed by the transformation. If the matrix has full rank, nothing is compressed at all; if it is a rank-2 matrix, the output maps onto a plane.

The rank-nullity theorem says that the dimension of the range plus the dimension of the kernel equals the dimension of the domain.
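As a sketch of that bookkeeping, here is a hypothetical $5\times 5$ matrix (entries chosen only for illustration) built as a product of rank-3 factors, so its rank is 3 and its nullity is 2:

```python
import numpy as np

B = np.array([[1.0, 0, 0],
              [0, 1, 0],
              [0, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])        # 5x3, full column rank
C = np.array([[1.0, 2, 0, 0, 1],
              [0, 1, 1, 0, 0],
              [0, 0, 1, 1, 2]])  # 3x5, full row rank
A = B @ C                        # 5x5, rank 3: squashes R^5 onto a 3-dim subspace

rank = np.linalg.matrix_rank(A)
s = np.linalg.svd(A, compute_uv=False)
nullity = int(np.sum(s <= 1e-10))  # count of (numerically) zero singular values

# rank-nullity: dim(range) + dim(kernel) = dim(domain) = 5
print(rank, nullity, rank + nullity)  # 3 2 5
```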

The dimension of the row space always equals the dimension of the column space which equals the rank of the matrix.

Does this help?