Is it possible in general, if $A,B,C,X,Y$ are square and of the same dimensions? If so, does it generalize to non-square matrices (using a pseudoinverse)?

I'm doing some curve fitting in which I have to estimate the two independent polarizations of a signal, given the data from multiple detectors and the scalar response function of each polarization (LIGO data analysis). Rewriting the equation above in general would let me express the fitted values as seen by multiple detectors ($Y$) as an explicit function of $X$, the data seen at each detector, and would enable a generalized cross-validation calculation for choosing a regularization parameter.
Thanks.
No.
Starting with the matrix equation $AX=YB$, assuming you can invert $B$, you can write $Y = AXB^{-1}$. Now, each element of $Y$ is a linear combination of all elements of $X$.
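A quick numerical sanity check of this step, using NumPy with randomly generated matrices (a random $B$ is almost surely invertible, but invertibility is an assumption here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.normal(size=(n, n))
B = rng.normal(size=(n, n))  # assumed invertible
X = rng.normal(size=(n, n))

# Y = A X B^{-1} is the solution of the matrix equation A X = Y B
Y = A @ X @ np.linalg.inv(B)
assert np.allclose(A @ X, Y @ B)
```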
Now look at $Y=CX$. Here each element of $Y$ is a linear combination of the elements in the corresponding column of $X$ only; it cannot mix in elements from other columns, so no choice of $C$ can reproduce the general dependence above.
The matrix equation does relate the elements of $X$ and $Y$ linearly, however, so you can write $\mathrm{vec}(Y) = C\,\mathrm{vec}(X)$, where $\mathrm{vec}$ stacks the columns of a matrix into a single vector; explicitly, $C = B^{-T} \otimes A$, a Kronecker product. This means that $C$ is an $n^2 \times n^2$ matrix if your original matrices were $n\times n$.
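The vectorized form above can be verified numerically with NumPy, using the standard identity $\mathrm{vec}(AXB) = (B^T \otimes A)\,\mathrm{vec}(X)$ for column-stacking $\mathrm{vec}$ (random test matrices, $B$ assumed invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))
B = rng.normal(size=(n, n))  # assumed invertible
X = rng.normal(size=(n, n))

Y = A @ X @ np.linalg.inv(B)  # solves A X = Y B

def vec(M):
    # Column-stacking vectorization: stack the columns of M into one vector.
    return M.flatten(order="F")

# vec(A X B^{-1}) = (B^{-T} kron A) vec(X)
C = np.kron(np.linalg.inv(B).T, A)  # n^2 x n^2

assert np.allclose(C @ vec(X), vec(Y))
```

Note that the relation is linear only between the *vectorized* matrices; as an $n^2 \times n^2$ system it can still be used directly in a generalized cross-validation computation.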