Solving $A*X*B = X*C$ with $X$ the only unknown


I've formulated a problem as above, and I'm wondering if a solution is actually possible.

Dimensionality:

  • $A$ is $N \times N$ (known)
  • $B$ and $C$ are $N \times 1$ (known)
  • $X$ is $N \times N$ (unknown)

Now obviously the problem is very underdetermined, and I also expect the solution to involve the null space, but I am not worried about that for now, since I can add an arbitrary number of observations.

But before I do that, I am just wondering if it is possible in principle to disentangle $X$ from the sandwich it finds itself in.


There are 2 best solutions below


Here is a partial answer:

  1. If $C=\lambda B$, then $AXB = \lambda XB$, so $XB$ is an eigenvector of $A$ with eigenvalue $\lambda$ (assuming $XB \neq 0$). You can compute all such eigenvectors and then define any $X$ that sends $B$ to one of them.

  2. Otherwise $B$ and $C$ are linearly independent, so you can extend them to a basis of the whole space. Let $v$ be any $N\times 1$ vector and $w=Av$. Then you can find $X$ such that $XB=v$ and $XC=w$ by defining a linear transformation on the basis we created (lots of liberty here); any such $X$ satisfies $AXB = Av = w = XC$, so it is a solution.
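A quick NumPy sketch of case 2, under assumed random sample data (the matrices $A$, $B$, $C$, the choice of $v$, and the extra basis images are all arbitrary placeholders): extend $\{B, C\}$ to a basis, assemble it into an invertible matrix $P$, and solve $XP = T$ for $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4

# Hypothetical data: A is N x N; B, C are N x 1 and (generically)
# linearly independent, which is case 2 of the answer.
A = rng.standard_normal((N, N))
B = rng.standard_normal((N, 1))
C = rng.standard_normal((N, 1))

# Pick any v and set w = Av, so that forcing XB = v and XC = w
# automatically gives AXB = Av = w = XC.
v = rng.standard_normal((N, 1))
w = A @ v

# Greedily extend {B, C} to a basis of R^N using standard basis vectors.
cols = [B, C]
for e in np.eye(N):
    if len(cols) == N:
        break
    if np.linalg.matrix_rank(np.column_stack(cols + [e])) > len(cols):
        cols.append(e.reshape(N, 1))
P = np.column_stack(cols)  # invertible; P[:, 0] = B, P[:, 1] = C

# Images of the basis vectors: XB = v, XC = w, and arbitrary images
# for the remaining basis vectors ("lots of liberty here").
T = np.column_stack([v, w] + [rng.standard_normal((N, 1)) for _ in range(N - 2)])

X = T @ np.linalg.inv(P)  # X P = T  =>  X = T P^{-1}

print(np.allclose(A @ X @ B, X @ C))  # True: X solves A X B = X C
```

Any different choice of $v$ or of the images of the remaining basis vectors yields another valid solution, which illustrates how underdetermined the problem is.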


$\def\L{\left}\def\R{\right}\def\v{\operatorname{vec}}$Use a Kronecker product to flatten the unknown matrix into a vector $x=\v(X)$:
$$\eqalign{ \v(AXb) - \v(I_nXc) &= 0 \\ \L((b^T\otimes A)-(c^T\otimes I_n)\R)x &= 0 \\ Mx &= 0 }$$
Therefore $x$ lies in the nullspace of $M$ and can be calculated in terms of an arbitrary vector $w$ and the pseudoinverse $M^+$:
$$\eqalign{ x &= \L(I_{n^2}-M^+M\R)w \\ X &= {\rm reshape}(x,n,n) }$$
This isn't a unique solution but a whole family of solutions, parameterized by the vector $w$.
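A minimal NumPy sketch of this construction, with assumed random data for $A$, $b$, $c$, and $w$. One detail worth flagging: $\v(\cdot)$ stacks columns, so the final reshape must use column-major (Fortran) order for the Kronecker identity $\v(AXb)=(b^T\otimes A)\,\v(X)$ to hold.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Hypothetical data: A is n x n, b and c are n x 1.
A = rng.standard_normal((n, n))
b = rng.standard_normal((n, 1))
c = rng.standard_normal((n, 1))

# M = (b^T ⊗ A) - (c^T ⊗ I_n), so that M vec(X) = vec(AXb - Xc).
M = np.kron(b.T, A) - np.kron(c.T, np.eye(n))  # shape (n, n^2)

# Project an arbitrary vector w onto the nullspace of M:
# x = (I - M^+ M) w.
w = rng.standard_normal(n * n)
x = (np.eye(n * n) - np.linalg.pinv(M) @ M) @ w

# vec() stacks columns, so undo it in column-major order.
X = x.reshape((n, n), order="F")

print(np.allclose(A @ X @ b, X @ c))  # True
```

Since $M$ is only $n \times n^2$, its nullspace has dimension at least $n^2 - n$, so a generic $w$ projects to a nonzero solution.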