Can you determine that ANY matrix with no free variables MUST have a leading variable in each row?


The question is: Let $A \in M_{n×n}(\mathbb{R})$ and let $d \in \mathbb{R}^n$. Prove that if the equation $Ax = d$ has a unique solution, then for every $b \in \mathbb{R}^n$, the equation $Ax = b$ has a unique solution.

One way I thought of to prove this is to show that $A$ has no free variables (using the fact that $Ax = d$ has a unique solution), and from there claim that every row of the matrix has a leading variable. Then, according to the theorem stating that the following are equivalent:

(1) $Ax = b$ has a solution for all b in $\mathbb{R}^m$.

(2) The span of the columns of $A$ is all of $\mathbb{R}^m$.

(3) $A$ has a leading variable in each row.

Then I can conclude that the original statement (if the equation $Ax = d$ has a unique solution, then for every $b \in \mathbb{R}^n$ the equation $Ax = b$ has a solution) is TRUE; uniqueness follows as well, since $A$ has no free variables.

My problem is that I'm not sure how to prove that if a matrix has no free variables it MUST have a leading variable in each row. Is it even possible to prove that? If so, how?


The answer is: $Ax = d$ has a unique solution if and only if $A$ is invertible. Since $A$ is invertible, for every $b$, $x := A^{-1}b$ is the unique solution.
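To see this numerically, here is a minimal sketch using numpy (the specific matrix is just an example, not from the original question): for an invertible $A$, `np.linalg.solve` returns the unique $x$ with $Ax = b$.

```python
import numpy as np

# An invertible 2x2 matrix (det = -3 != 0) and an arbitrary right-hand side.
A = np.array([[-1.0, 1.0],
              [ 1.0, 2.0]])
b = np.array([0.0, 1.0])

# Solve Ax = b; equivalent to x = inv(A) @ b but numerically preferable.
x = np.linalg.solve(A, b)

assert np.allclose(A @ x, b)                     # x solves the system
assert np.allclose(x, np.linalg.inv(A) @ b)      # and equals A^{-1} b
```

The same call would raise a `LinAlgError` for a singular $A$, which is exactly the case where the solution fails to exist or to be unique.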


If you are curious about the iff statement...

...the easiest way to see why is to cycle through the properties of the RREF of $A$.

Let $A_{\text{inv}}$ be the product of the elementary matrices corresponding to the row operations that transform $A$ into $\text{RREF}(A)$.

$Ax=d$ has a unique solution if and only if $\text{RREF}(A)$ is the identity matrix. You can see this by noting that $Ax=d$ is equivalent to $\text{RREF}(A)x=A_{\text{inv}}d$. And a square matrix whose RREF is the identity is invertible because, by definition, $I=\text{RREF}(A)=A_{\text{inv}}\cdot A$, so $A_{\text{inv}}$ is an inverse of $A$.
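A quick sanity check of this equivalence, sketched with sympy (using sympy here is my choice, not from the original answer): the RREF of an invertible square matrix is the identity, with a pivot (leading variable) in every row.

```python
from sympy import Matrix, eye

A = Matrix([[-1, 1],
            [ 1, 2]])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()

assert R == eye(2)        # invertible square matrix => RREF is the identity
assert len(pivots) == 2   # a leading variable in each of the 2 rows
```

By contrast, `Matrix([[1, 2], [2, 4]]).rref()` yields a single pivot, matching the fact that that matrix is singular.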


If you are not familiar with elementary row operations or RREF, here is the essence.

First, you should read from the link or elsewhere what elementary row operations on a matrix look like. Next..

.. an elementary row operation is achieved by left-multiplying by the matrix produced by applying the same elementary operation to the identity.

For example, starting with

$$ \begin{equation}\tag{1} \begin{bmatrix} -1 & 1\\ 1 & 2 \end{bmatrix} \begin{bmatrix} x\\ y \end{bmatrix} =\begin{bmatrix} 0\\ 1 \end{bmatrix} \end{equation} $$

suppose you want to add row 1 to row 2. Performing that operation on the identity matrix, you get:

$$ \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & 0\\ 1 & 1 \end{bmatrix} $$

The row operation on the original equation is thus

$$ \begin{bmatrix} 1 & 0\\ 1 & 1 \end{bmatrix} \begin{bmatrix} -1 & 1\\ 1 & 2 \end{bmatrix} \begin{bmatrix} x\\ y \end{bmatrix} =\begin{bmatrix} 1 & 0\\ 1 & 1 \end{bmatrix} \begin{bmatrix} 0\\ 1 \end{bmatrix} $$

which simplifies to

$$ \begin{equation}\tag{2} \begin{bmatrix} -1 & 1\\ 0 & 3 \end{bmatrix} \begin{bmatrix} x\\ y \end{bmatrix} =\begin{bmatrix} 0\\ 1 \end{bmatrix} \end{equation} $$

(1) is true if and only if (2) is true. Thus, you can solve (2) and get the answer to (1). You do that because the answer to (2) is (more) obvious.
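The worked example above can be verified in a few lines of numpy: left-multiplying by the elementary matrix $E$ reproduces the matrix in (2), and systems (1) and (2) have the same solution.

```python
import numpy as np

# System (1): the original matrix and right-hand side.
A = np.array([[-1.0, 1.0],
              [ 1.0, 2.0]])
b = np.array([0.0, 1.0])

# E is the identity with row 1 added to row 2 (the elementary matrix above).
E = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# Left-multiplying by E performs the row operation, giving the matrix in (2).
assert np.allclose(E @ A, [[-1.0, 1.0],
                           [ 0.0, 3.0]])

# (1) and (2) are equivalent systems: same solution.
assert np.allclose(np.linalg.solve(A, b), np.linalg.solve(E @ A, E @ b))
```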

Every matrix has an RREF because there is an algorithm (Gauss–Jordan elimination) that uses elementary row operations to produce the RREF of any matrix.