Given matrix $A$ and vector $x$, we can view $A$ as a linear transform, so that $Ax$ means $x$ transformed by $A$. The powerful insight is that $A$ transforms basis vector $e_j$ to $A_{:j}$ (the $j^{th}$ col of $A$): we can read the transform's definition right out of the matrix encoding! $A$'s definition, as a linear transform of $x$, is therefore quite explicit.
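This column-reading property is easy to check numerically. A minimal NumPy sketch (the matrix `A` here is just an arbitrary example) showing that $Ae_j = A_{:j}$:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])

# Standard basis vector e_j, with j = 1 (0-indexed second column)
e1 = np.array([0., 1.])

# A @ e1 picks out the second column of A
print(A @ e1)    # [2. 4.]
print(A[:, 1])   # [2. 4.]
```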
Although $B$ is a matrix rather than a vector, we can still view $AB$ as $A$ "acting on" $B$: $A$ transforms each col of $B$ by this linear transform.
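The column-wise view can be verified directly (a sketch with arbitrary random matrices): the $j^{th}$ col of $AB$ equals $A$ applied to the $j^{th}$ col of $B$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 4))

AB = A @ B
# Each column of AB is A acting on the corresponding column of B
for j in range(B.shape[1]):
    assert np.allclose(AB[:, j], A @ B[:, j])
```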
Now, we can also view $AB$ as $B$ acting on $A$. That is, $B$ transforms each row of $A$ by a linear transform. I would like to understand the nature of that transform (that is, of how $B$ transforms $A$).
Question: How can we describe the transformation that $B$ does to $A$?
My attempt so far is below. I'd appreciate verification, critique, or other approaches.
If the standard basis is $\{e_n\}$, let $\{t_n\}$ be the transposes of the $e_n$. That is, $\{t_n\}$ is the basis of row vectors in which the rows of matrix $A$ are encoded (whereas $\{e_n\}$ is the basis of col vectors). Is there a name for this row basis? I couldn't find a reference for it. Then $B_{i:}$ (that is, the $i^{th}$ row of $B$) is the image of $t_i$ under right multiplication by $B$: $$t_i B = B_{i:}$$
$B$ transforms $A$ by replacing each row $A_{i:}$ with its image $A_{i:}B$ under the linear transform thus encoded.
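Both claims check out numerically (a sketch; the matrices are arbitrary examples): $t_i B$ returns the $i^{th}$ row of $B$, and each row of $AB$ is the corresponding row of $A$ right-multiplied by $B$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 3))

# t_i B = B_{i:} -- a standard row basis vector picks out row i of B
t0 = np.zeros(3)
t0[0] = 1.0
assert np.allclose(t0 @ B, B[0, :])

# Each row of AB is the corresponding row of A, transformed by B
AB = A @ B
for i in range(A.shape[0]):
    assert np.allclose(AB[i, :], A[i, :] @ B)
```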
(Justification: $(x^\top A^\top)^\top = Ax$, so if left multiplication by a matrix transforms the cols of $x$ or $B$ according to the cols of $A$, then right multiplication by a matrix must transform the rows of $x^\top$ or $A$ according to the rows of $B$.)
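The transpose identity underlying this justification can also be checked numerically (a sketch with arbitrary random inputs):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# (x^T A^T)^T = A x: right multiplication of the row vector x^T by A^T
# gives the same result as left multiplication of the column vector x by A
assert np.allclose((x @ A.T).T, A @ x)
```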
Is this correct? If not, what is the correct way to understand $B$ acting on $A$ (or right multiplication by a matrix in general)?