Solving a bilinear matrix equation


Consider a sparse matrix $\mathbf{A} \in \mathbb{C}^{M \times N}$ that has only one non-zero element, and two given constant matrices $\mathbf{B}_1 \in \mathbb{C}^{M \times L}$ and $\mathbf{C}_1 \in \mathbb{C}^{N\times L}$, where $L \leq N \ll M$. We know that there exist $\mathbf{B}_2 \in \mathbb{C}^{M \times L}$ and $\mathbf{C}_2 \in \mathbb{C}^{N \times L}$ for which
$$\mathbf{A} = \mathbf{B}_1\mathbf{C}_2^* + \mathbf{B}_2\mathbf{C}_1^* + \mathbf{B}_2\mathbf{C}_2^* \:,$$ where $(\cdot)^*$ is the conjugate transpose operator. Is there a way to find closed-form expressions for $\mathbf{B}_2$ and $\mathbf{C}_2$?

PS: It may be posed as an optimization problem like $$\mathrm{arg}\min_{\mathbf{B}_2,\mathbf{C}_2}~\|\mathbf{A} - \mathbf{B}_1\mathbf{C}_2^* - \mathbf{B}_2\mathbf{C}_1^* - \mathbf{B}_2\mathbf{C}_2^*\|_F^{2},$$ but it ends up being non-convex, so alternating-type algorithms come with no strong guarantees unless we can show that a local optimum is actually a global optimum.
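Although the objective is jointly non-convex, it is linear in $\mathbf{C}_2$ for fixed $\mathbf{B}_2$ and linear in $\mathbf{B}_2$ for fixed $\mathbf{C}_2$, so an alternating least-squares sweep is at least monotone. A minimal NumPy sketch, on a hypothetical random instance generated from a known solution pair (all names and sizes here are illustrative, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N, L = 12, 6, 3

# Illustrative instance built from a known (B2, C2), so the global minimum is 0.
B1 = rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L))
C1 = rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))
B2t = rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L))
C2t = rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))
A = B1 @ C2t.conj().T + B2t @ C1.conj().T + B2t @ C2t.conj().T

def objective(B2, C2):
    R = A - B1 @ C2.conj().T - B2 @ C1.conj().T - B2 @ C2.conj().T
    return np.linalg.norm(R)

B2 = np.zeros((M, L), dtype=complex)
C2 = np.zeros((N, L), dtype=complex)
res = [objective(B2, C2)]
for _ in range(100):
    # With B2 fixed, A - B2 C1^* = (B1 + B2) C2^* is linear least squares in C2.
    C2h, *_ = np.linalg.lstsq(B1 + B2, A - B2 @ C1.conj().T, rcond=None)
    C2 = C2h.conj().T
    # With C2 fixed, A - B1 C2^* = B2 (C1 + C2)^*; transpose-conjugate and solve.
    B2h, *_ = np.linalg.lstsq(C1 + C2, (A - B1 @ C2.conj().T).conj().T, rcond=None)
    B2 = B2h.conj().T
    res.append(objective(B2, C2))

print(res[0], res[-1])  # the objective is non-increasing across sweeps
```

Each subproblem is solved exactly, so the objective never increases; only convergence to a stationary point, not to the global optimum, is guaranteed.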

There are 3 answers below.

Answer 1

Write your equation as

$$A + B_1 C_1^* = (B_1 + B_2)(C_1 + C_2)^*$$

Now the right side has rank at most $L$, while the left side has rank at most $L+1$ (since $A$ has rank one); if its rank is exactly $L+1$, there is no solution.

EDIT: Suppose $A + B_1 C_1^* \in \mathbb C^{M \times N}$ has rank $r \le L$. Its singular value decomposition is $U \Sigma V^*$ where $U$ and $V$ are unitary and $\Sigma$ is a diagonal matrix with $r$ positive elements (which may be taken to be the first $r$) and the rest $0$. Then we may take $B_1 + B_2$ and $C_1 + C_2$ to be the first $L$ columns of $U \Sigma$ and of $V$ respectively; $B_2$ and $C_2$ then follow by subtracting $B_1$ and $C_1$.
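This construction is easy to check numerically. A NumPy sketch, on an instance rigged so that the rank condition holds (the sizes and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, L = 8, 5, 3

# Build an instance where a solution is guaranteed to exist:
# pick B2, C2 first, then define A so that the identity holds.
B1 = rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L))
C1 = rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))
B2_true = rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L))
C2_true = rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))
A = B1 @ C2_true.conj().T + B2_true @ C1.conj().T + B2_true @ C2_true.conj().T

# Factor A + B1 C1^* = (B1 + B2)(C1 + C2)^* via a rank-L truncated SVD.
T = A + B1 @ C1.conj().T
U, s, Vh = np.linalg.svd(T, full_matrices=False)
assert s[L:].max() < 1e-8 * s[0]    # rank(T) <= L must hold for a solution

B2 = U[:, :L] * s[:L] - B1          # first L columns of U @ Sigma, minus B1
C2 = Vh[:L, :].conj().T - C1        # first L columns of V, minus C1

residual = A - B1 @ C2.conj().T - B2 @ C1.conj().T - B2 @ C2.conj().T
print(np.linalg.norm(residual))     # ~ 0
```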

Answer 2

We have the following bilinear matrix equation in $\mathrm X \in \mathbb C^{m \times k}$ and $\mathrm Y \in \mathbb C^{n \times k}$

$$\mathrm X \mathrm A^* + \mathrm B \mathrm Y^* + \mathrm X \mathrm Y^* = \mathrm C$$

where $\mathrm A \in \mathbb C^{n \times k}$, $\mathrm B \in \mathbb C^{m \times k}$ and $\mathrm C \in \mathbb C^{m \times n}$ are given. All matrices are tall.

Adding $\rm B A^*$ to both sides,

$$\mathrm X \mathrm A^* + \mathrm B \mathrm Y^* + \mathrm X \mathrm Y^* + \mathrm B \mathrm A^* = \mathrm C + \mathrm B \mathrm A^*$$

Factoring the left-hand side,

$$\left( \mathrm X + \mathrm B \right) \left( \mathrm Y + \mathrm A \right)^* = \mathrm C + \mathrm B \mathrm A^*$$

If $\color{blue}{\mbox{rank} (\mathrm C + \mathrm B \mathrm A^*) =: r \leq k}$, then the matrix equation above has a solution. Computing the singular value decomposition (SVD) of $\mathrm C + \mathrm B \mathrm A^*$, we obtain

$$\rm C + B A^* = U \Sigma V^* = \begin{bmatrix} \mathrm U_1 & \mathrm U_2\end{bmatrix} \begin{bmatrix} \Sigma_1 & \mathrm O\\ \mathrm O & \mathrm O\end{bmatrix} \begin{bmatrix} \mathrm V_1^*\\ \mathrm V_2^*\end{bmatrix} = U_1 \Sigma_1 V_1^*$$

where $\Sigma_1$ is a $k \times k$ (diagonal) matrix. Thus, a solution would be

$$\rm X = U_1 \Sigma_1^{\frac 12} - B \qquad \qquad \qquad \rm Y = V_1 \Sigma_1^{\frac 12} - A$$
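This square-root split of $\Sigma_1$ can be verified directly in NumPy. A sketch on an instance constructed to satisfy the rank condition (names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 9, 6, 3

A_ = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
B = rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k))
# Construct C from a known (X, Y) so that C + B A^* has rank <= k.
X_true = rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k))
Y_true = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
C = X_true @ A_.conj().T + B @ Y_true.conj().T + X_true @ Y_true.conj().T

U, s, Vh = np.linalg.svd(C + B @ A_.conj().T, full_matrices=False)
U1, s1, V1 = U[:, :k], s[:k], Vh[:k, :].conj().T

# Split Sigma_1 symmetrically between the two factors.
X = U1 * np.sqrt(s1) - B
Y = V1 * np.sqrt(s1) - A_

err = np.linalg.norm(X @ A_.conj().T + B @ Y.conj().T + X @ Y.conj().T - C)
print(err)  # ~ 0
```

Any other split $\Sigma_1 = \Sigma_a \Sigma_b$ (e.g. putting all of $\Sigma_1$ on one side, as in the previous answer) works equally well.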

Answer 3

If you don't want to do big calculations, then you can try the following.

Take $C_2=0$ and consider the equation $XC_1^*=A$ with unknown $X\in M_{M,L}$. The rows of $A$ are $0$ except one, which (after normalizing the non-zero entry to $1$) equals $e_i^T$. Thus we can choose the rows of $X$ equal to $0$ except one row $[x_1,\cdots,x_L]$ satisfying $[x_1,\cdots,x_L]C_1^*=e_i^T$, that is, this skinny system:

$\overline{C_1}[x_1,\cdots,x_L]^T=e_i\in \mathbb{C}^N$, where $\overline{C_1}:\mathbb{C}^L\rightarrow\mathbb{C}^N$ with $L\leq N$; note that the dimension deficit is $N-L$, instead of $1$ if we drop the hypothesis $C_2=0$. It remains to check whether $e_i\in \mathrm{im}(\overline{C_1})$.
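A small NumPy sketch of this shortcut. Note that for a generic $C_1$ with $L < N$ the membership check fails, so the instance below is rigged to put $e_i$ in the image of $\overline{C_1}$ (all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
M, N, L = 10, 6, 4

# A has a single non-zero entry, at (row j, column i); take its value to be 1.
j, i = 4, 2
A = np.zeros((M, N), dtype=complex)
A[j, i] = 1.0
e_i = np.zeros(N, dtype=complex)
e_i[i] = 1.0

C1 = rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))
C1[:, 0] = e_i.conj()   # rigged so that e_i lies in the image of conj(C1)

# Solve conj(C1) x = e_i; the vector x becomes the j-th row of B2.
x, *_ = np.linalg.lstsq(C1.conj(), e_i, rcond=None)
consistent = np.linalg.norm(C1.conj() @ x - e_i) < 1e-10

B2 = np.zeros((M, L), dtype=complex)   # all rows zero except row j
B2[j, :] = x
err = np.linalg.norm(A - B2 @ C1.conj().T)   # zero when the system is consistent
print(consistent, err)
```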