Interpretation of the Rows of a Matrix (against a Finite Dimensional Transformation)


Consider the following snippet from a Wikipedia article on Matrices, which gives a nice interpretation of the columns of a matrix:

[Image: Wikipedia excerpt giving the interpretation of the columns of a matrix as the images of the basis vectors under the associated linear map]

Question: Using this notation, is there some analogous interpretation of the rows of an $m \times n$ matrix $\mathbf{A}$?

Attempt: Let the rows of $\mathbf{A}$ be denoted $r_1, \ldots, r_m \in \mathbf{R}^n$. Do we have that $\{f(r_1), \ldots, f(r_m)\}$ is a basis for $\mathbf{R}^m$?

2 Answers

Answer 1:

First, a bit of nit-picking. The rows of $\mathbf A$ are elements of $\mathbb R^n$, but the domain of $f$ is $V$, so it doesn’t really make sense to speak of $f(r_i)$; you can, however, talk about $\mathbf Ar_i$, or about the vectors in $V$ that the rows of $\mathbf A$ represent relative to the chosen basis.

Anyway, you can’t conclude that the set $\{\mathbf Ar_i\}$ forms a basis for $\mathbb R^m$, or even that it spans $\mathbb R^m$, any more than you can say that the columns of $\mathbf A$ span $\mathbb R^m$. What you can say is that the images of the rows of $\mathbf A$ span the same space as do its columns. The latter span a subspace of $\mathbb R^m$ called, somewhat unimaginatively, the column space of $\mathbf A$. This subspace corresponds to the image of $f$ under the isomorphism created when you choose a basis for $W$.
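As a quick numerical sketch of the claim just made (the matrix here is my own rank-deficient example, not from the question): the vectors $\mathbf Ar_i$, where $r_i$ runs over the rows of $\mathbf A$, span the same subspace of $\mathbb R^m$ as the columns of $\mathbf A$, i.e. $\operatorname{col}(\mathbf A\mathbf A^T)=\operatorname{col}(\mathbf A)$.

```python
import numpy as np

# Illustrative rank-2 example: row 2 is twice row 1.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [0., 1., 1.]])

# Column j of A @ A.T is A applied to row j of A.
images_of_rows = A @ A.T

rank_cols = np.linalg.matrix_rank(A)
# If the two spans coincide, stacking them side by side adds no rank.
rank_both = np.linalg.matrix_rank(np.hstack([A, images_of_rows]))

assert rank_both == rank_cols == np.linalg.matrix_rank(images_of_rows)
```

Note that both ranks come out as 2, not 3, which is why neither the columns nor the images of the rows span all of $\mathbb R^3$ here.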

In a similar way, the rows of $\mathbf A$ span a subspace of $\mathbb R^n$ called the row space of $\mathbf A$. In elementary presentations, this space is described as the orthogonal complement of the null space of $\mathbf A$, which corresponds to the kernel of $f$, but that description implicitly introduces an inner product on $\mathbb R^n$. This can be avoided by considering the dual spaces $W^*$ and $V^*$.

As you’ve no doubt learned, these are the spaces of linear functionals on $W$ and $V$, respectively, and the adjoint $f^*:W^*\to V^*$ of a linear map $f:V\to W$ is the linear map such that for all $\mathbf v\in V$ and $\beta\in W^*$, $(f^*\beta)\mathbf v = \beta(f\mathbf v)$. We say that an element $\alpha$ of $V^*$ annihilates a subspace $U$ of $V$ if $\alpha(U)=\{0\}$. The set of all $\alpha$ that annihilate $U$ is a subspace of $V^*$ called the annihilator of $U$, denoted $U^0$ (or $U^\perp$, for reasons that should become clear).

There is a key relation among four principal subspaces of these vector spaces: $$\operatorname{im}(f^*) = (\ker f)^0 \\ \ker(f^*) = (\operatorname{im}f)^0.$$ Given a matrix representation $\mathbf A$ of the linear transformation $f$, there is a choice of bases for $W^*$ and $V^*$ (the dual bases of the ones chosen for $V$ and $W$) in which the representation of $f^*$ is $\mathbf A^T$. The above relationship among the images and kernels of $f$ and $f^*$ then translates into relationships among the null spaces and column spaces of $\mathbf A$ and $\mathbf A^T$. Moreover, in this coordinate representation, $\alpha(\mathbf v)$ becomes the dot product of the coordinate representations of $\alpha$ and $\mathbf v$, which lets you identify the annihilator of a subspace of $V$ or $W$ with the orthogonal complement of the corresponding subspace of $\mathbb R^n$ or $\mathbb R^m$. In particular, we have $$\operatorname{col}(\mathbf A^T) = (\operatorname{nul}\mathbf A)^\perp \\ \operatorname{nul}(\mathbf A^T) = (\operatorname{col}\mathbf A)^\perp,$$ and, of course, $\operatorname{row}(\mathbf A)=\operatorname{col}(\mathbf A^T)$.
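The two displayed relations can be checked numerically on a concrete matrix (the matrix and the SVD-based null-space helper below are mine, chosen for illustration): every row of $\mathbf A$ is orthogonal to $\operatorname{nul}(\mathbf A)$, every column is orthogonal to $\operatorname{nul}(\mathbf A^T)$, and the dimensions pair off as complements.

```python
import numpy as np

# Rank-2 example in R^4: row 3 = row 1 + row 2.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 1.],
              [1., 3., 1., 2.]])

def null_space(M, tol=1e-10):
    """Orthonormal basis (as columns) for nul(M), read off the SVD."""
    _, s, vh = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vh[rank:].T

N  = null_space(A)      # nul(A),   a subspace of R^4
Nt = null_space(A.T)    # nul(A^T), a subspace of R^3

# col(A^T) = nul(A)^perp: every row of A kills every null vector ...
assert np.allclose(A @ N, 0)
# ... and nul(A^T) = col(A)^perp: every column of A kills nul(A^T).
assert np.allclose(A.T @ Nt, 0)

# Dimension bookkeeping: nullity = (number of columns) - rank.
r = np.linalg.matrix_rank(A)
assert N.shape[1] == A.shape[1] - r and Nt.shape[1] == A.shape[0] - r
```

The rank here is 2, so $\operatorname{nul}(\mathbf A)$ is a plane in $\mathbb R^4$ and $\operatorname{nul}(\mathbf A^T)$ is a line in $\mathbb R^3$, exactly complementing the row and column spaces.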

Answer 2:

One important role of the rows appears when you consider the homogeneous system $AX=0$, where $A$ is the matrix of coefficients.

The solution vector $X$ is orthogonal to each and every row of the matrix $A$; that is, $X$ lies in the orthogonal complement of the row space.
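This orthogonality is immediate from the definition of the matrix-vector product, since entry $i$ of $AX$ is the dot product of row $i$ with $X$. A tiny check (matrix and solution chosen for illustration):

```python
import numpy as np

# A X = 0 forces x1 = x2 = x3, so X = (1, 1, 1) solves the system.
A = np.array([[1., -1., 0.],
              [0.,  1., -1.]])
X = np.array([1., 1., 1.])

assert (A @ X == 0).all()            # X solves the homogeneous system
for row in A:
    assert row @ X == 0.0            # and is orthogonal to every row
```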

The natural companion of the row space is the column space of $A$, which is the image of the transformation $T(X)=AX$: the product $AX$ is a linear combination of the columns of $A$.

Thus a system $AX=B$ has a solution if and only if $B$ is in the image of $T$, i.e. the column space of $A$.
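That solvability criterion can be tested by comparing the rank of $A$ with the rank of the augmented matrix $[A \mid B]$: appending $B$ adds no rank exactly when $B$ already lies in the column space. A small sketch (the matrix and right-hand sides are mine, not from the answer):

```python
import numpy as np

# Columns of A span a 2-dimensional plane inside R^3.
A = np.array([[1., 2.],
              [2., 4.],
              [1., 1.]])

B_in  = A @ np.array([1., 1.])       # in col(A) by construction
B_out = np.array([0., 1., 0.])       # not in col(A)

def solvable(A, b):
    """AX = b has a solution iff appending b does not raise the rank."""
    return np.linalg.matrix_rank(np.column_stack([A, b])) == \
           np.linalg.matrix_rank(A)

assert solvable(A, B_in)
assert not solvable(A, B_out)
```

For `B_out`, the first two equations demand $x_1 + 2x_2 = 0$ and $2x_1 + 4x_2 = 1$ simultaneously, which is impossible, matching the rank test.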