I have been stuck for a while on the following question from Shifrin's book Multivariable Mathematics.
Let $A$ be an arbitrary $m \times n$ matrix. Can there be vectors $b_1,b_2,b_3 \in \mathbb{R}^m$, such that $A \mathbf{x} = \mathbf{b_1}$ has no solution, $A \mathbf{x} = \mathbf{b_2}$ has exactly one solution and $A \mathbf{x} = \mathbf{b_3}$ has infinitely many solutions?
(The solution should be given without using any advanced concepts such as determinants, inverses, or eigenvalues, and not even linear independence. It should use only simple matrix algebra and the concepts of rank, span, linear combination, standard dot product, etc. For an example see my attempt below.)
My feeling is that this is not possible, but I'm not convinced yet. I've written what I have so far below, but my arguments are not watertight. I thought of the solutions of a linear system $A \mathbf{x} = \mathbf{b}$ as giving the coefficients $x_i$ of the linear combination of the column vectors of $A$ that produces $\mathbf{b}$. But I know that one can also think of the solutions as the intersection of the hyperplanes $\mathbf{A}_i \cdot \mathbf{x} = b_i$, where $\mathbf{A}_i$ are the row vectors of $A$ and $b_i$ is the $i$-th component of $\mathbf{b}$. If that is a better approach to this problem, please point it out.
We know that for a linear system $A \mathbf{x} = \mathbf{b}$ to have a solution, we need $\mathbf{b} \in \text{span}\{\mathbf{a_1},\ldots,\mathbf{a_n}\}$, i.e. $\mathbf{b}$ must be a linear combination of the column vectors of $A$.
Let's choose $\mathbf{b_1}$ outside the span, so that $A \mathbf{x} = \mathbf{b_1}$ has no solution. For $A \mathbf{x} = \mathbf{b_2}$ to have exactly one solution, we need $\mathbf{b_2} \in \text{span}\{\mathbf{a_1},\ldots,\mathbf{a_n}\}$, and additionally that there is exactly one linear combination of the columns that gives $\mathbf{b_2}$. Assuming both conditions hold, it follows that $A \mathbf{x} = \mathbf{b_3}$ cannot have infinitely many solutions. For that we would need $\mathbf{b_3} \in \text{span}\{\mathbf{a_1},\ldots,\mathbf{a_n}\}$ (so that $A \mathbf{x} = \mathbf{b_3}$ has at least one solution) together with infinitely many ways to build such a linear combination. But this is impossible: if $\mathbf{s}$ and $\mathbf{s'}$ were two distinct solutions of $A \mathbf{x} = \mathbf{b_3}$, then $A(\mathbf{s}-\mathbf{s'}) = \mathbf{0}$, so adding $\mathbf{s}-\mathbf{s'}$ to the solution of $A \mathbf{x} = \mathbf{b_2}$ would produce a second solution for $\mathbf{b_2}$ as well, since both vectors lie in the same subspace $V = \text{span}\{\mathbf{a_1},\ldots,\mathbf{a_n}\}$.
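The translation step in this argument can be checked concretely. Below is a minimal pure-Python sketch; the matrix $A$ and the vectors are illustrative choices of mine, not from the book. Given two distinct solutions $s, s'$ of $A\mathbf{x} = \mathbf{b_3}$ and a solution $u$ of $A\mathbf{x} = \mathbf{b_2}$, the translated vector $u + (s - s')$ also solves $A\mathbf{x} = \mathbf{b_2}$:

```python
# Sketch: two distinct solutions for one right-hand side force
# non-uniqueness for every solvable right-hand side.
# A, s, s2, u below are illustrative choices, not from the book.

def matvec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2],
     [2, 4]]          # rank-1 matrix: columns (1,2) and (2,4)

b3 = [3, 6]           # lies in the column span
s  = [3, 0]           # one solution of A x = b3
s2 = [1, 1]           # a different solution of A x = b3
assert matvec(A, s) == b3 and matvec(A, s2) == b3

b2 = [1, 2]           # another vector in the column span
u  = [1, 0]           # a solution of A x = b2
assert matvec(A, u) == b2

# Translate u by the difference s - s2 (which A sends to zero):
v = [ui + (si - s2i) for ui, si, s2i in zip(u, s, s2)]
assert v != u
assert matvec(A, v) == b2   # a second, distinct solution of A x = b2
```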
Let’s talk about linear maps geometrically. I’m going to use linear maps instead of matrices because I don’t want to talk about grids of numbers and I don't really care about bases.
Now, firstly, it’s certainly possible for there to be exactly one solution: take $A=I$ and $b$ to be your favourite vector. Next, let $V$ have a basis $e,f$ and let $\alpha : V\to V$ be the linear map corresponding to $A$; finally, set $\alpha e=e$ and $\alpha f = 0$. Then there are no solutions to $\alpha x = f$, while $x = e + t f$ is a solution to $\alpha x = e$ for any scalar $t$. But is it also possible to have some $b$ for which there is exactly one solution?
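In coordinates with respect to the basis $e,f$, the map $\alpha$ above is the matrix $\begin{pmatrix}1&0\\0&0\end{pmatrix}$. A quick pure-Python check of both claims (the small search range is an arbitrary choice for illustration):

```python
# In the basis {e, f}, alpha(e) = e and alpha(f) = 0 gives the matrix A below.
def matvec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 0],
     [0, 0]]
e, f = [1, 0], [0, 1]

# Every output A x has second coordinate 0, so alpha x = f is unsolvable:
assert matvec(A, f) == [0, 0]          # alpha kills f
assert all(matvec(A, [x1, x2])[1] == 0 for x1 in range(-3, 4)
                                       for x2 in range(-3, 4))

# ... while alpha x = e has a whole line of solutions x = e + t f:
for t in range(-3, 4):
    assert matvec(A, [e[0] + t * f[0], e[1] + t * f[1]]) == e
```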
Suppose we have some $\alpha,b_1,b_2$ such that there is no solution to $\alpha x = b_1$ and there is an infinite set $S$ of solutions to $\alpha x = b_2$. Pick some $s\in S$; then in fact $S = s + W$, where $W$ is the null space of $\alpha$: if $s'\in S$ then $\alpha(s-s')= b_2-b_2=0$, so $s-s'\in W$, and conversely $\alpha(s+w) = b_2$ for every $w \in W$. Since $S$ is infinite, $W$ has positive dimension.
Now let’s assume that we have some $b_3$ such that there is a single solution to $\alpha x = b_3$; call that solution $x$. Pick $0\ne w\in W$ (recall that there are infinitely many such $w$, since $W$ is a vector space over $\Bbb R$ of dimension at least $1$). What is $\alpha (x+w)$? Expanding, $\alpha x + \alpha w = b_3 + 0 = b_3$, so the solution $x$ is not unique, a contradiction.
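This final step can also be verified concretely; here $A$, $x$, and $w$ are illustrative choices of mine, with $w$ a nonzero null vector:

```python
# Sketch: a nonzero null vector w turns any solution x into a second one.
def matvec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2],
     [2, 4]]                 # its null space is spanned by w = (2, -1)
w = [2, -1]
assert matvec(A, w) == [0, 0]

x  = [1, 0]                  # a solution of A x = b3 ...
b3 = matvec(A, x)            # ... for this right-hand side b3

# alpha(x + w) = alpha(x) + alpha(w) = b3 + 0 = b3,
# so x + w is a second solution and x was not unique:
xw = [xi + wi for xi, wi in zip(x, w)]
assert xw != x
assert matvec(A, xw) == b3
```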