I am reading the following section in the Deep Learning book by Goodfellow, Bengio and Courville.
I have some questions.
If $A\in\mathbb{R}^{m\times n}$ and $Ax=y$, then we can have a left inverse $B$ such that $x=By$ only when the nullspace of $A$ is $\{0\}$: if $Ax=0=y$ for some $x\neq 0$, then any candidate $B$ gives $By=B0=0\neq x$. In other words, the dimension of the row space, and hence the rank of $A$, must be $n$.
Thus, if $A$ is taller than it is wide, $m>n$, then it is possible for a left inverse to exist. On the other hand, if $A$ is wider than it is tall, $m<n$, then the rank can be at most $m<n$, which means that $A$ cannot have a left inverse.
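To make the tall, full-rank case concrete, here is a small numpy sketch (the matrix $A$ is an arbitrary example of mine, not from the book): when the columns are independent, $B=(A^\top A)^{-1}A^\top$ is a left inverse, and $x$ can be recovered from $y=Ax$.

```python
import numpy as np

# A tall matrix (m > n) with linearly independent columns,
# so rank(A) = n and the nullspace is {0}.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])  # shape (3, 2)

# One left inverse: B = (A^T A)^{-1} A^T (the Moore-Penrose pseudoinverse).
B = np.linalg.inv(A.T @ A) @ A.T

# B A = I_n, so whenever y = A x, we get x back as B y.
assert np.allclose(B @ A, np.eye(2))

x = np.array([2.0, -3.0])
y = A @ x
assert np.allclose(B @ y, x)  # x = B y recovers x exactly
```

Note that this only says the map $x\mapsto Ax$ is injective; it says nothing about which $y$ are reachable.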
However, the screenshot I have attached shows the exact opposite: if $A$ is taller than it is wide, then it is possible for this equation to have no solution; if $A$ is wider than it is tall, then there could be multiple possible solutions.
Where have I gone wrong in my reasoning?

Let us suppose that $A$ is taller than it is wide. Then, yes, it may have a left inverse, and the equation $Ax=y$ may have no solution. You are acting as if there is a contradiction here, but where is it? A left inverse guarantees that a solution is unique *if one exists*; it does not guarantee that one exists. If $A$ is the null matrix, then $A$ has no left inverse and, if $y\neq0$, the equation $Ax=y$ has no solution. On the other hand, if $A$ has a single column and $A\neq0$, then $A$ has a left inverse (infinitely many, in fact), and yet the equation $Ax=y$ still has no solution whenever $y$ is not a scalar multiple of that column.
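The single-column case above can be checked numerically; this is a sketch with a column I chose myself, not one from the discussion. The left inverse exists, yet $Ax=y$ is unsolvable for a $y$ outside the column space.

```python
import numpy as np

# A tall matrix with a single nonzero column: a left inverse exists...
a = np.array([[1.0], [2.0]])       # shape (2, 1), A != 0
B = a.T / (a.T @ a)                # one left inverse: B a = 1
assert np.allclose(B @ a, [[1.0]])

# ...yet A x = y has no solution when y is not a multiple of the column.
y = np.array([[1.0], [0.0]])       # not proportional to (1, 2)
x_candidate = B @ y                # the only possible solution, x = B y
assert not np.allclose(a @ x_candidate, y)  # A x != y: no solution exists
```

Since the nullspace is trivial, $x=By$ is the only candidate solution, so checking that $A(By)\neq y$ suffices to rule out any solution.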