When we write a set of simultaneous equations as a matrix-vector product of the form:
$$\mathbf{A} \mathbf{p} = \mathbf{b}$$
(with compatible dimensions)
- the left-hand side $\mathbf{A} \mathbf{p}$ can be thought of as the point/vector $\mathbf{p}$ being mapped onto a new point $\mathbf{b}$
- the equation asks which initial point(s) end up at $\mathbf{b}$.
The mapping is easy to see as a linear combination of the columns of $\mathbf{A}$ (here $A_i$ is the $i$-th column of $\mathbf{A}$ and $p_i$ the $i$-th entry of $\mathbf{p}$):
$$A_1 p_1 + A_2 p_2 + \ldots + A_n p_n = \mathbf{b}$$
- If $\mathbf{b}$ is not in the span of the columns of $\mathbf{A}$, then there is no solution.
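A minimal sketch of the column view, using a small NumPy example of my own (the matrices here are arbitrary, not taken from the notes): the product $\mathbf{A}\mathbf{p}$ is exactly the linear combination of the columns of $\mathbf{A}$ weighted by the entries of $\mathbf{p}$.

```python
import numpy as np

# Arbitrary example matrix and vector (hypothetical values).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
p = np.array([2.0, -1.0])

# A @ p computed as a linear combination of A's columns:
# A_1 * p_1 + A_2 * p_2
combo = p[0] * A[:, 0] + p[1] * A[:, 1]

assert np.allclose(A @ p, combo)
```

So $\mathbf{b}$ is reachable exactly when it lies in the column space of $\mathbf{A}$.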
Basic Questions
How should we think about this when $P$ is a matrix?
Is each point (column vector) $P_i$ similarly mapped to a new location by $A$, so that we solve, in turn and independently, for the $P_i$ that gets mapped onto $b_i$?
Are these systems of equations actually related, though? My point being that each row of $P$ now seems to belong to a different, unrelated set of equations.
If $P$ is a matrix (and so is "$B$"), every column of $P$ is transformed, and the transformed vector becomes the corresponding column of $B$. In other words, you solve each column of $P$ against the corresponding column of $B$, and these column problems are independent of one another.
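This column-by-column view can be checked numerically. A sketch with made-up matrices (the values are illustrative, not from the notes): solving $AP = B$ all at once gives the same result as solving $A p_j = b_j$ for each column $j$ separately.

```python
import numpy as np

# Hypothetical invertible A and right-hand side B.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[1.0, 4.0],
              [2.0, 5.0]])

# Solve A P = B for all columns at once.
P_all = np.linalg.solve(A, B)

# Solve one column at a time: A p_j = b_j for each column j of B.
P_cols = np.column_stack(
    [np.linalg.solve(A, B[:, j]) for j in range(B.shape[1])]
)

assert np.allclose(P_all, P_cols)   # the two approaches agree
assert np.allclose(A @ P_all, B)    # and each column of P maps onto its column of B
```

`np.linalg.solve` accepts a matrix right-hand side precisely because the column problems are independent, so the batched solve is just a convenience over the per-column loop.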