The most important use of matrices lies in representing linear transformations on a vector space...
A matrix represents the transformation that takes the first basis vector to the first column of the matrix, the second basis vector to the second column, and, in general, the $j$-th basis vector to the $j$-th column.
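As a quick numerical illustration of this (a sketch using NumPy, with an arbitrary example matrix):

```python
import numpy as np

# An arbitrary 2x3 matrix A, acting on R^3 by left multiplication x -> A x.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Each standard basis vector e_j of R^3 is sent to the j-th column of A.
for j in range(3):
    e_j = np.zeros(3)
    e_j[j] = 1.0
    assert np.array_equal(A @ e_j, A[:, j])
```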
The above applies to left multiplication by a matrix $Ax$. Is there an analog for right multiplication $yA$ (with $y$ being a row vector)?
Can we use the above to see $AB$ as a transformation of $B$ by $A$ and as a transformation of $A$ by $B$?
(See also this post which is related, but not as direct.)
Update
Thank you, MJD, for the comments.
I believe the correct answer is: $yB$ is the transformation that takes the first row basis vector to the first row of $B$, the second row basis vector to the second row of $B$, and, in general, viewing the row vector $y$ as a linear combination of the row basis vectors, takes $y$ to the corresponding linear combination of the rows of $B$.
A row basis would consist of the transposes of the standard basis vectors; I'm surprised I can't find any reference for this idea.
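This claim can be checked directly (a sketch in NumPy, with an arbitrary example matrix $B$ and an arbitrary row vector $y$):

```python
import numpy as np

# An arbitrary 3x2 matrix B.
B = np.array([[1, 2],
              [3, 4],
              [5, 6]])

# Each row basis vector (transpose of a standard basis vector) times B
# picks out the corresponding row of B.
for i in range(3):
    e_i = np.zeros(3)
    e_i[i] = 1.0
    assert np.array_equal(e_i @ B, B[i])

# A general row vector y = 2 e_1 - e_2 + 3 e_3 is sent to the same
# linear combination of the rows of B.
y = np.array([2.0, -1.0, 3.0])
assert np.array_equal(y @ B, 2 * B[0] - B[1] + 3 * B[2])
```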
This implies that $AB$ is both:
- A transform of $B$ by $A$, replacing each col of $B$ with the transform defined by $A$ on it: $B_{:j} \mapsto A(B_{:j})$
- And, a transform of $A$ by $B$, replacing each row of $A$ with the transform defined by $B$ acting on it from the right: $A_{i:} \mapsto A_{i:}B$
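The two views above can be verified numerically (a sketch in NumPy, with arbitrary example matrices): building $AB$ column by column as $A$ applied to the columns of $B$, and row by row as the rows of $A$ multiplied on the right by $B$, gives the same product.

```python
import numpy as np

# Arbitrary example matrices with AB defined (2x2 times 2x3).
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6, 7],
              [8, 9, 10]])

# Column view: the j-th column of AB is A applied to the j-th column of B.
col_view = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

# Row view: the i-th row of AB is the i-th row of A times B.
row_view = np.vstack([A[i, :] @ B for i in range(A.shape[0])])

assert np.array_equal(col_view, A @ B)
assert np.array_equal(row_view, A @ B)
```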
The miraculous thing is that for any two matrices, in any space, if $AB$ is defined, both transforms yield the same matrix. How do we explain that?
Is the above correct?
Suppose $A$ is an $m \times n$ matrix over a field $F$. Let $U$ be the vector space of all $p \times m$ matrices over $F$, and let $V$ be the vector space of all $p \times n$ matrices over $F$. Then the map $U \to V$ defined by $C \mapsto CA$, for $C \in U$, is a linear transformation from $U$ to $V$. In particular, if $p = 1$, we get a linear transformation from the vector space of row vectors with $m$ entries to the vector space of row vectors with $n$ entries.
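The linearity of $C \mapsto CA$ can be checked concretely (a sketch in NumPy, with arbitrary example matrices; here $m = 2$, $n = 3$, $p = 2$):

```python
import numpy as np

# An arbitrary m x n matrix A over the reals (m = 2, n = 3).
A = np.array([[1, 2, 3],
              [4, 5, 6]])

# The map T : U -> V, sending a p x m matrix C to the p x n matrix CA.
def T(C):
    return C @ A

# Two arbitrary p x m matrices (p = 2) and arbitrary scalars.
C1 = np.array([[1.0, 0.0],
               [2.0, 1.0]])
C2 = np.array([[0.0, 3.0],
               [1.0, 1.0]])
a, b = 2.0, -1.0

# Linearity: T(a*C1 + b*C2) = a*T(C1) + b*T(C2).
assert np.allclose(T(a * C1 + b * C2), a * T(C1) + b * T(C2))
```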