Why does every $ m \times n$ matrix of rank $r$ reduce to $(m \times r)$ times $(r \times n)$?


How can I prove the following statement?

Every $m \times n$ matrix of rank $r$ reduces to $(m \times r)$ times $(r \times n)$:

$A = $ (pivot columns of $A$) (first $r$ rows of $R$) = (COL)(ROW).

[ Source: Gilbert Strang, Introduction to Linear Algebra, question $56$, section $3.2$. ]

I think that $R$ here is the reduced row echelon form of $A$. I believe there is a brief, elegant proof.
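Before proving the identity, it is easy to sanity-check numerically. A minimal SymPy sketch, using a made-up $3\times 4$ rank-$2$ matrix (an assumed example, not from the book):

```python
from sympy import Matrix

# A hypothetical 3x4 rank-2 matrix (an assumed example, not from Strang).
A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [3, 6, 1, 10]])

R, pivots = A.rref()       # R = rref(A); pivots = indices of pivot columns
r = len(pivots)            # rank of A

COL = A[:, list(pivots)]   # pivot columns of A   (m x r)
ROW = R[:r, :]             # first r rows of R    (r x n)

assert COL * ROW == A      # A = (COL)(ROW)
```

Here $r=2$, `COL` is $3\times 2$, `ROW` is $2\times 4$, and their product rebuilds $A$ exactly.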


There are 4 best solutions below


Let $\{v_1,\dotsc, v_r\}$ be a basis of $\operatorname{Col}(A)$ where $A$ is $m\times n$. Put these basis vectors into the columns of a matrix $V$, so that $$ V=\left[\begin{array}{ccc}v_1 & \cdots & v_r\end{array}\right] $$ Then, for every $b\in\operatorname{Col}(A)$ there exists an $x_b\in\Bbb R^r$ such that $Vx_b=b$.

Now, let $\{a_1,\dotsc, a_n\}$ be the columns of $A$ so that $$ A=\left[\begin{array}{ccc}a_1 & \cdots & a_n\end{array}\right] $$ Each column $a_k$ of $A$ is in $\operatorname{Col}(A)$. So, let $$ W=\left[\begin{array}{ccc}x_{a_1} & \cdots & x_{a_n}\end{array}\right] $$ What happens when we compute $VW$?
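Carrying this construction out on a small assumed example (a hypothetical rank-$2$ matrix, not from the answer): `columnspace` supplies the basis $V$, and since $V$ has full column rank, each coordinate vector $x_{a_k}$ can be recovered from the normal equations $V^TVx=V^Tb$.

```python
from sympy import Matrix

# Hypothetical rank-2 example (assumed, not from the answer).
A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [3, 6, 1, 10]])

# V: a basis of Col(A), one basis vector per column.
V = Matrix.hstack(*A.columnspace())   # 3 x 2

# W: column k holds the coordinates x_{a_k} of a_k in that basis,
# obtained via the normal equations (V has full column rank).
W = (V.T * V).inv() * V.T * A         # 2 x 4

assert V * W == A                     # VW rebuilds A column by column
```

Computing $VW$ reassembles every column of $A$ from the basis, which is exactly the punchline the answer is hinting at.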


Suppose the pivot columns of $\mathbf{A}$ are $\mathbf{p_1}, \cdots, \mathbf{p_r}$ and its free columns are $\mathbf{f_1}, \cdots, \mathbf{f_{n-r}}$, each sitting in its original position, e.g. $\mathbf{A}=[\mathbf{p_1}, \mathbf{f_1}, \mathbf{p_2}, \cdots]$. Let $\mathbf{L}=[\mathbf{p_1}, \cdots, \mathbf{p_r}]$ be the $m\times r$ matrix whose columns are the pivot columns of $\mathbf{A}$. Let $\mathbf{R}$ be the reduced row echelon form of $\mathbf{A}$, with pivot columns $\mathbf{p_i^R}$ and free columns $\mathbf{f_j^R}$ in the corresponding positions, and let $\mathbf{R^c}$ be the first $r$ rows of $\mathbf{R}$.

Matrix $\mathbf{A}$ is $m$ by $n$ with rank $r$, so the last $m-r$ rows of $\mathbf{R}$ are zero. Thus $\mathbf{L}\mathbf{R^c}=[\mathbf{L}\;\,\mathbf{B}]\,\mathbf{R}$ for any $m\times(m-r)$ block $\mathbf{B}$, because $\mathbf{B}$ only multiplies those zero rows.

Now we need to prove $[\mathbf{L}\;\,\mathbf{B}]\,\mathbf{R}=\mathbf{A}$, column by column. Every entry of $\mathbf{p_i^R}$ is $0$ except a $1$ in the $i^{th}$ row, so $[\mathbf{L}\;\,\mathbf{B}]\,\mathbf{p_i^R} = \mathbf{p_i}$: the pivot columns of $\mathbf{A}$ are reproduced. For a free column $\mathbf{f_j^R}=[c_1, \cdots, c_r, 0, \cdots, 0]^{T}$, we have $c_1\mathbf{p_1^R}+\cdots+c_r\mathbf{p_r^R}=\mathbf{f_j^R}$. Because $\mathbf{R}$ has the same nullspace as $\mathbf{A}$, the same linear relation holds among the columns of $\mathbf{A}$: $c_1\mathbf{p_1}+\cdots+c_r\mathbf{p_r}=\mathbf{f_j}$, i.e. $[\mathbf{L}\;\,\mathbf{B}]\,\mathbf{f_j^R}=\mathbf{f_j}$. Hence $\mathbf{A}=[\mathbf{L}\;\,\mathbf{B}]\,\mathbf{R}=\mathbf{L}\mathbf{R^c}$.


Welcome to alternative proofs.
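The pivotal step above — the coefficients that express a free column of $\mathbf{R}$ over its pivot columns also express the corresponding free column of $\mathbf{A}$ over its pivot columns — can be checked on a small assumed example (a hypothetical rank-$2$ matrix, not from the answer):

```python
from sympy import Matrix

# Hypothetical rank-2 example (assumed, not from the answer).
A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [3, 6, 1, 10]])
R, pivots = A.rref()      # pivot columns at indices 0 and 2
r = len(pivots)

# A free column of R looks like [c_1, ..., c_r, 0, ..., 0]^T: its top r
# entries are its coefficients over the pivot columns e_1, ..., e_r of R.
j = 3                     # index of a free column of A (and of R)
c = R[:r, j]              # the coefficients c_1, ..., c_r

# Same nullspace  =>  the same combination of A's pivot columns
# produces the corresponding free column of A.
assert A[:, list(pivots)] * c == A[:, j]
```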


Let the $m\times n$ matrix $A=\left[a_1, \ldots, a_r, a_{r+1},\ldots,a_n\right]$ have rank $r$. Without loss of generality (you can permute the columns), assume the first $r$ columns are its pivot columns, collected as $P$, and the remaining $n-r$ columns are free columns, collected as $F$.

Its reduced row echelon form is $R=\left[\begin{array}{c c} I_{r\times r}&F^0_{r\times(n-r)} \\ 0_{(m-r)\times r}& 0_{(m-r)\times (n-r)} \end{array}\right] $.

Since $R$ is obtained from $A$ by row eliminations, passing from $A$ to $R$ amounts to left-multiplying by a sequence of elementary matrices, each of which is $m\times m$ and invertible.

So we have

$A=ER$, where $E$ is an invertible $m\times m$ matrix (the inverse of that product of elementary matrices).

Rewrite the equation above as $[P,F] = E\left[\begin{array}{c c} I & F^0\\ 0 & 0 \end{array}\right] $,

so, $P=E\left[\begin{array}{c} I\\ 0\end{array}\right],F=E\left[\begin{array}{c} F^0\\ 0\end{array}\right]$.

We want to prove $A=P[I,F^0]$, that is, $A$ equals the pivot columns of $A$ times the first $r$ rows of $R$.

Since $P[I,F^0]=[PI,PF^0]=[P, PF^0]$, the first $r$ columns are the pivot columns themselves. It remains to show that $PF^0=F$, the free columns.

Since $P=E\left[\begin{array}{c} I\\ 0\end{array}\right]$, then

$PF^0 = E\left[\begin{array}{c} I\\ 0\end{array}\right]F^0=E\left[\begin{array}{c} F^0\\ 0\end{array}\right]=F$. These are indeed the free columns, which completes the proof.
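This argument can be traced numerically. One way to obtain a concrete $E$ is to row-reduce the augmented matrix $[A\mid I]$: the result is $[R\mid E']$ with $E'A=R$, and $E=E'^{-1}$. A sketch on an assumed example whose pivot columns come first, matching the setup above:

```python
from sympy import Matrix, eye

# Hypothetical example (assumed) with its pivot columns first: r = 2.
A = Matrix([[1, 0, 2, 3],
            [2, 1, 4, 7],
            [3, 1, 6, 10]])
m, n = A.shape
r = A.rank()

# Row-reduce [A | I]: the row operations that turn A into R turn I into E',
# so the reduced augmented matrix is [R | E'] with E'A = R.
aug = Matrix.hstack(A, eye(m)).rref()[0]
R, Eprime = aug[:, :n], aug[:, n:]

E = Eprime.inv()             # A = E R with E invertible
assert E * R == A

P = A[:, :r]                 # pivot columns of A
assert P == E[:, :r]         # P = E [I; 0], the first r columns of E
assert P * R[:r, :] == A     # A = P [I, F^0] = (COL)(ROW)
```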


I provide a proof similar to the one by Brian.

Denote by $R$ the reduced row echelon form of $A$, and by $E$ the matrix that stands for the row operations (i.e. left multiplication) on $A$ that produce $R$; then we have $EA=R$.

Further, notice that the selection of pivot columns can be expressed as a right multiplication on $A$, denoted by an $n$-by-$r$ matrix $J$,

while the selection of the first $r$ rows of $R$ can be expressed as a left multiplication on $R$, denoted by an $r$-by-$m$ matrix $K=\begin{bmatrix}I_{r,r} \;\;O_{r,(m-r)}\end{bmatrix}$.

What we are interested to show is

$\begin{align} AJKR&=A\\ \Longleftarrow\ EAJKR &=EA=R \quad (\text{because } E \text{ is invertible}) \\ \Longleftarrow\ RJKR &=R \quad (\text{since } EA=R) \end{align}$

And it is not too hard to see that the last equation holds: each pivot column of $R$ has exactly one nonzero entry, a $1$ in its pivot row, so the $m$-by-$r$ matrix $RJ$ stacks $I_{r,r}$ over zeros, and left multiplication by $RJ$ amounts to appending $m-r$ rows of zeros to an $r$-row matrix:

$RJ=\begin{bmatrix}I_{r,r}\\ O_{(m-r),r}\end{bmatrix}$,

We have $RJK=\begin{bmatrix}I_{r,r} \;\;O_{r,(m-r)}\\ O_{(m-r),r}\;\;O_{(m-r),(m-r)}\end{bmatrix}=\begin{bmatrix}K\\O_{(m-r),m}\end{bmatrix}$, which selects the first $r$ rows while zeroing out the remaining $m-r$ rows.

Lastly, $RJKR=\begin{bmatrix}\text{first } r \text{ rows of } R\\O_{(m-r),n}\end{bmatrix}=R$, because the last $m-r$ rows of $R$ are zero already.
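The selector matrices are easy to build explicitly. A SymPy sketch on an assumed rank-$2$ example (not from the answer), with $J$ picking the pivot columns and $K$ picking the first $r$ rows:

```python
from sympy import Matrix, eye, zeros

# Hypothetical rank-2 example (assumed, not from the answer).
A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [3, 6, 1, 10]])
m, n = A.shape
R, pivots = A.rref()
r = len(pivots)

# J (n x r): right multiplication selects the pivot columns, so AJ is m x r.
J = zeros(n, r)
for col, row in enumerate(pivots):
    J[row, col] = 1

# K (r x m): left multiplication selects the first r rows, so KR is r x n.
K = Matrix.hstack(eye(r), zeros(r, m - r))

assert R * J == Matrix.vstack(eye(r), zeros(m - r, r))  # RJ = [I; O]
assert R * J * K * R == R                               # RJKR = R
assert A * J * K * R == A                               # A = (AJ)(KR)
```

The last assertion is exactly the factorization from the question: $AJ$ is the pivot columns and $KR$ is the first $r$ rows of $R$.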