Can I decompose this matrix into separate parts?


Right now I have a matrix of the form

\begin{pmatrix} 0 & (a^\top b) \, b \, c^\top \\ c \, b^\top (b^\top a) & 0 \end{pmatrix},

where $a$ and $b$ are vectors of the same dimension, and $c$ is another vector of arbitrary dimension. The bottom-left submatrix is the transpose of the upper-right submatrix. Is there any way to disentangle the variables here so that I get something, for example, of the form $$M_1(a) \, M_2(b) \, M_3(c),$$ where $M_1(a)$ is a matrix that contains only $a$, and so on?


BEST ANSWER

While I don't see a way to obtain exactly the form you suggest, I was able to decompose the matrix into a similar form.

First, suppose $a$ and $b$ are $n$-dimensional column vectors and $c$ is an $m$-dimensional column vector. We can reorganize the expressions to

\begin{equation} \begin{bmatrix} 0_{n \times n} & (a^\top b)\, b c^\top \\ c b^\top (b^\top a) & 0_{m \times m} \end{bmatrix} = \begin{bmatrix} 0_{n \times n} & b\, a^\top b\, c^\top \\ c\, b^\top a\, b^\top & 0_{m \times m} \end{bmatrix}, \end{equation}

which can be decomposed into

\begin{equation} \begin{bmatrix} I_{n \times n} & 0_{n \times 1} \\ 0_{m \times n} & c \end{bmatrix} \begin{bmatrix} 0_{n \times n} & b \\ b^\top & 0 \end{bmatrix} \begin{bmatrix} 0_{n \times n} & a \\ a^\top & 0 \end{bmatrix} \begin{bmatrix} 0_{n \times n} & b \\ b^\top & 0 \end{bmatrix} \begin{bmatrix} I_{n \times n} & 0_{n \times m} \\ 0_{1\times n} & c^\top \end{bmatrix}. \end{equation}
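This five-factor decomposition can be sanity-checked numerically with NumPy. A quick sketch (the sizes $n = 3$, $m = 2$ and the random vectors are arbitrary choices, not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
a, b = rng.standard_normal(n), rng.standard_normal(n)
c = rng.standard_normal(m)

# Target matrix: [[0, (a.b) b c^T], [c b^T (b.a), 0]]
target = np.block([[np.zeros((n, n)), (a @ b) * np.outer(b, c)],
                   [np.outer(c, b) * (b @ a), np.zeros((m, m))]])

# The five factors from the decomposition above
F1 = np.block([[np.eye(n), np.zeros((n, 1))],
               [np.zeros((m, n)), c[:, None]]])      # (n+m) x (n+1)
F2 = np.block([[np.zeros((n, n)), b[:, None]],
               [b[None, :], np.zeros((1, 1))]])      # (n+1) x (n+1)
F3 = np.block([[np.zeros((n, n)), a[:, None]],
               [a[None, :], np.zeros((1, 1))]])      # (n+1) x (n+1)
F5 = np.block([[np.eye(n), np.zeros((n, m))],
               [np.zeros((1, n)), c[None, :]]])      # (n+1) x (n+m)

product = F1 @ F2 @ F3 @ F2 @ F5
assert np.allclose(product, target)
```

The middle product $F_2 F_3 F_2$ collapses to $\begin{bmatrix} 0 & (a^\top b)\,b \\ (b^\top a)\,b^\top & 0 \end{bmatrix}$, and the outer factors then attach $c$ and $c^\top$ to the correct blocks.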


While this isn't a fully rigorous non-existence proof, the following argument suggests that no such decomposition exists for $n \times n$ matrices with $n > 2$.

It is easy to check that $$\begin{bmatrix} 0 & a^\intercal b b c^\intercal\\ cb^\intercal b^\intercal a & 0 \end{bmatrix} = \begin{bmatrix} 0 & a^\intercal \\ c& 0 \end{bmatrix}\begin{bmatrix} 0 & b^\intercal b^\intercal\\ b b & 0 \end{bmatrix}\begin{bmatrix} 0 & c^\intercal\\ a & 0 \end{bmatrix}.$$

Now suppose that the matrix $\begin{bmatrix} 0 & c^\intercal\\ a & 0 \end{bmatrix}$ has some decomposition $M(a)\,M(c)$. That is, for any two $n\times n$ matrices $a$ and $c$, we can find scalars $j_1,\dots, j_8, k_1, \dots, k_8$ such that for the matrices $$M(a)=\begin{bmatrix} j_1a+j_2a^\intercal & j_3a+j_4a^\intercal\\ j_5a+j_6a^\intercal & j_7a+j_8a^\intercal \end{bmatrix}, \\ M(c)=\begin{bmatrix} k_1c+k_2c^\intercal & k_3c+k_4c^\intercal\\ k_5c+k_6c^\intercal & k_7c+k_8c^\intercal \end{bmatrix},$$ we have $M(a)\,M(c) = \begin{bmatrix} 0 & c^\intercal\\ a & 0 \end{bmatrix}.$ Then, comparing upper-right blocks, $$(j_1k_3+j_3k_7)ac+(j_1k_4+j_3k_8)ac^\intercal+(j_2k_3+j_4k_7)a^\intercal c+(j_2k_4+j_4k_8)a^\intercal c^\intercal = c^\intercal.$$

Now, let $V$ be the vector space of $n \times n$ matrices, and observe that $\dim(\operatorname{span}\{ac, ac^\intercal, a^\intercal c, a^\intercal c^\intercal\}) \leq 4$ while $\dim(V)=n^2.$ Since $a$ and $c$ are arbitrary, and for $n>2$ we have $n^2>4$, the element $c^\intercal$ generically lies outside this span, so such a decomposition $M(a)\,M(c) = \begin{bmatrix} 0 & c^\intercal\\ a & 0 \end{bmatrix}$ does not exist in general for $n>2$.
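The dimension-counting step can be illustrated numerically: for random $3 \times 3$ matrices $a$ and $c$, the target $c^\intercal$ generically falls outside the at-most-4-dimensional span of the four products. A quick sketch (the seed and size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
a, c = rng.standard_normal((n, n)), rng.standard_normal((n, n))

# Columns: vectorized ac, ac^T, a^T c, a^T c^T (spanning at most 4 dims of R^{n^2})
S = np.column_stack([(a @ c).ravel(),
                     (a @ c.T).ravel(),
                     (a.T @ c).ravel(),
                     (a.T @ c.T).ravel()])
target = c.T.ravel()

# Least-squares projection of c^T onto the span; a nonzero residual
# means c^T is not a linear combination of the four products
coef, *_ = np.linalg.lstsq(S, target, rcond=None)
residual = np.linalg.norm(S @ coef - target)
assert np.linalg.matrix_rank(S) <= 4   # span dimension is at most 4
assert residual > 1e-6                 # c^T lies outside the span (generically)
```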

If $n=2$ and $\dim(\operatorname{span}\{ac, ac^\intercal, a^\intercal c, a^\intercal c^\intercal\}) = 4$, there might be a solution for the two equations $$(j_1k_3+j_3k_7)ac+(j_1k_4+j_3k_8)ac^\intercal+(j_2k_3+j_4k_7)a^\intercal c+(j_2k_4+j_4k_8)a^\intercal c^\intercal = c^\intercal, \\ (j_5k_1+j_7k_5)ac+(j_5k_2+j_7k_6)ac^\intercal+(j_6k_1+j_8k_5)a^\intercal c+(j_6k_2+j_8k_6)a^\intercal c^\intercal = a,$$ over the variables $j_1,\dots, j_8, k_1, \dots, k_8$, but I haven't checked :)

P.S. The reason this doesn't work in general is that matrix multiplication on $V$ is non-commutative: in general, $ab \neq ba$. A more rigorous nonexistence proof for $n>2$ would probably use an algebraic argument along these lines. Maybe someone else will chime in!