I've formulated a problem as above, and I am wondering if a solution is actually possible.
Dimensionality:
- $A$ is $N \times N$ (known)
- $B$ and $C$ are $N \times 1$ (known)
- $X$ is $N \times N$ (unknown)
Now obviously the problem is very underdetermined, and I am also expecting the solution to involve the null space, but I am not worried about that for now, as I can add an arbitrary number of observations.
But before I do that, I am just wondering whether it is possible in principle to disentangle $X$ from the sandwich it finds itself in.
Here is a partial answer:
If $C=\lambda B$, then $XB$ must be an eigenvector of $A$ with eigenvalue $\lambda$. You can compute all such eigenvectors and then take any $X$ that sends $B$ to one of them.
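A minimal NumPy sketch of this case, assuming the equation in question is $AXB = XC$ (which matches the structure of this answer; if your equation differs, adapt accordingly). I use a symmetric $A$ so its eigenpairs are real, set $C = \lambda B$ for a chosen eigenvalue $\lambda$, and build an $X$ sending $B$ to the corresponding eigenvector:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4

A = rng.standard_normal((N, N))
A = A + A.T  # symmetric, so eigenvalues and eigenvectors are real

B = rng.standard_normal((N, 1))

# Pick one eigenpair (lam, u) of A and define C = lam * B.
eigvals, U = np.linalg.eigh(A)
lam, u = eigvals[0], U[:, [0]]
C = lam * B

# Extend {B} to a basis (random columns are generically independent)
# and send B to u; the images of the other basis vectors are free.
P = np.hstack([B, rng.standard_normal((N, N - 1))])
assert np.linalg.matrix_rank(P) == N
Q = np.hstack([u, rng.standard_normal((N, N - 1))])
X = Q @ np.linalg.inv(P)

# A X B = lam * u = lam * X B = X (lam B) = X C
print(np.allclose(A @ X @ B, X @ C))  # True
```

The freedom in the remaining $N-1$ columns of `Q` is exactly the "lots of liberty" mentioned below.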
Otherwise $B$ and $C$ are linearly independent, so you can extend them to a basis of the whole space. Let $v$ be any $N\times 1$ vector and set $w=Av$. Then you can find $X$ such that $XB=v$ and $XC=w$ by defining a linear transformation on the basis constructed above (with lots of liberty in the images of the remaining basis vectors); any such $X$ is a solution.
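The construction above can be sketched numerically, again under the assumption that the equation is $AXB = XC$ (inferred from this answer, not stated in the question). The matrix `P` collects the basis $\{B, C, \dots\}$ as columns, `Q` collects the prescribed images $\{v, w, \dots\}$, and $X = QP^{-1}$ is the map sending each basis vector to its image:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5

A = rng.standard_normal((N, N))
B = rng.standard_normal((N, 1))
C = rng.standard_normal((N, 1))  # generically independent of B

# Extend {B, C} to a basis of R^N; random columns are
# generically linearly independent of B and C.
P = np.hstack([B, C, rng.standard_normal((N, N - 2))])
assert np.linalg.matrix_rank(P) == N

# Pick any v, set w = A v; the other images are arbitrary.
v = rng.standard_normal((N, 1))
w = A @ v
Q = np.hstack([v, w, rng.standard_normal((N, N - 2))])

# X sends the columns of P to the columns of Q: X = Q P^{-1}.
X = Q @ np.linalg.inv(P)

# Check: X B = v and A X B = A v = w = X C.
print(np.allclose(X @ B, v))        # True
print(np.allclose(A @ X @ B, X @ C))  # True
```

Each fresh choice of $v$ and of the free columns of `Q` yields a different solution $X$, which makes the underdetermination concrete.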