Min. Number of Sparse Matrix Elements to preserve Matrix Properties under Permutations


Given matrices $S \in \mathbb{R}^{G \times K}$, $Q \in \mathbb{R}^{K \times K}$, and $T \in \mathbb{R}^{G \times K}$ with $T = S \cdot Q$, I would like to find the minimum number of zero entries in $S$ such that the columns of $T$ can only be scaled and/or permuted versions of the columns of $S$.
Hence, $Q$ should be of the form
(A) $Q = \operatorname{diag}(q)$ or
(B) $Q = \operatorname{diag}(q) \cdot P_{\pi}$
with $q \in \mathbb{R}^K$ and $P_{\pi}$ being a permutation matrix.

After playing around a bit, my thoughts are the following:
I found that if one column contains at least $K-1$ zeros, another one at least $K-2$ zeros, ..., and one column at least $1$ zero (e.g., if $S$ is a lower-triangular matrix (LTM)), then $Q$ has to be a diagonal matrix. Hence, imposing $\frac{1}{2} K (K-1)$ constraints (in the form of zeros) on $S$ provides a solution to (A).
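As a quick sanity check on the count (an illustrative sketch, not part of the proof), a "staircase" pattern in which column $k$ carries $k-1$ structural zeros sums to exactly $\frac{1}{2}K(K-1)$:

```python
# Illustrative check: a staircase zero pattern, where column k has k-1
# structural zeros (k = 1, ..., K), contains K*(K-1)/2 zeros in total.
K = 5
staircase_zeros = sum(k - 1 for k in range(1, K + 1))
assert staircase_zeros == K * (K - 1) // 2  # 0 + 1 + 2 + 3 + 4 = 10
```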

I am not able to prove it, but I would conjecture that the minimum number of zeros has to be $\frac{1}{2} K (K-1)$ and that, in addition, $S$ needs to have rank $K$. Is that correct? Does anybody have a reference paper for this kind of problem/proof?

1 Answer

We can describe the sparsity pattern of matrix $S \in \mathbb{R}^{m \times n}$ by a bipartite graph $G = (V, E)$ with bipartition $\left( \lbrace 1, \dots, m \rbrace, \lbrace 1, \dots, n \rbrace\right)$ and weight function $w: E \rightarrow \mathbb{R}$. For example the matrix $$ S = \begin{pmatrix} x & 0 & 0\\ 0 & y & 0\\ 0 & 0 & 0\\ 0 & 0 & z \end{pmatrix} $$ can be represented by a graph $G$:

[figure: bipartite graph $G$ with row vertices $1, \dots, 4$ and column vertices $1, 2, 3$; edges $(1,1)$, $(2,2)$, $(4,3)$ with weights $x$, $y$, $z$]
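The correspondence between the matrix and the edge list can be sketched as follows (a minimal illustration; the placeholder values for $x$, $y$, $z$ are assumptions):

```python
# Sketch: the sparsity pattern of S as a weighted bipartite edge list.
# Row vertices 1..m on one side, column vertices 1..n on the other;
# each non-zero entry S[i][j] becomes an edge (i+1, j+1) with that weight.
x, y, z = 1.0, 2.0, 3.0  # arbitrary non-zero values for the symbolic entries
S = [[x, 0, 0],
     [0, y, 0],
     [0, 0, 0],
     [0, 0, z]]

edges = {(i + 1, j + 1): S[i][j]
         for i in range(len(S))
         for j in range(len(S[0]))
         if S[i][j] != 0}
# edges == {(1, 1): x, (2, 2): y, (4, 3): z}
```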

The scaled and permuted columns of $T = S \cdot Q$ can be represented by a graph $H$ isomorphic to $G$. For example, with $q = (1,2,3)^T$ and $\pi = (2, 1, 3)$, $$ \begin{pmatrix} x & 0 & 0\\ 0 & y & 0\\ 0 & 0 & 0\\ 0 & 0 & z \end{pmatrix} \begin{pmatrix} 0 & 1 & 0\\ 2 & 0 & 0\\ 0 & 0 & 3 \end{pmatrix} = \begin{pmatrix} 0 & x & 0\\ 2y & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 3z \end{pmatrix} $$ can be represented by a graph $H$:

[figure: bipartite graph $H$ with the same vertex sets; edges $(1,2)$, $(2,1)$, $(4,3)$ with weights $x$, $2y$, $3z$]
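The worked product above can be verified with a plain-Python matrix multiplication (a sketch; the concrete values for $x$, $y$, $z$ are assumptions):

```python
# Verify the example S·Q = T, with x, y, z instantiated arbitrarily.
x, y, z = 1.0, 2.0, 3.0
S = [[x, 0, 0], [0, y, 0], [0, 0, 0], [0, 0, z]]
Q = [[0, 1, 0], [2, 0, 0], [0, 0, 3]]  # Q = diag(1, 2, 3) · P_pi, pi = (2, 1, 3)

T = [[sum(S[i][k] * Q[k][j] for k in range(3)) for j in range(3)]
     for i in range(4)]
assert T == [[0, x, 0], [2 * y, 0, 0], [0, 0, 0], [0, 0, 3 * z]]
```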

Let $|E(G)|$ be the number of edges in $G$, or equivalently, the number of non-zero entries of $S$. The sparsity of $S$ corresponds to the number of edges in the complement of $G$; in other words, the number of zeros in $S$ is $m \cdot n - |E(G)|$ ($9 = 4 \cdot 3 - 3$ in the above example).


Let $H$ be a graph that represents the given matrix $T$ having some property. And let $G$ be the graph representation of matrix $S$ such that $T = S \cdot Q$ as specified in the problem statement. Then we can write the optimization problem as: \begin{align*} & \text{minimize} \ m \cdot n - |E(G)| \\ & \text{subject to:} \tag{P} \\ & \qquad G \cong H \end{align*}

Or equivalently as \begin{align*} & \text{maximize} \ |E(G)| \\ & \text{subject to:} \\ & \qquad G \cong H \end{align*}
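Since $Q$ only scales and permutes columns, the isomorphisms of interest fix the row vertices, so the constraint $G \cong H$ reduces to comparing the multisets of column supports. A minimal sketch (the helper name and example values are assumptions):

```python
# Sketch: with row vertices fixed, G ≅ H holds iff the multisets of
# column supports (sets of row indices carrying non-zeros) coincide.
def column_supports(M):
    m, n = len(M), len(M[0])
    return sorted(tuple(i for i in range(m) if M[i][j] != 0)
                  for j in range(n))

S = [[1, 0, 0], [0, 2, 0], [0, 0, 0], [0, 0, 3]]
T = [[0, 1, 0], [4, 0, 0], [0, 0, 0], [0, 0, 9]]  # columns scaled and permuted
assert column_supports(S) == column_supports(T)
```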

Since $G$ and $H$ are isomorphic ($G \cong H$), the number of edges in $G$ is the same as in $H$. So the optimum is just a constant $|E(H)|$. (There's nothing to optimize, just count the zeros in $T$.)
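This observation can be made concrete: scaling columns by non-zero factors and permuting them never changes the number of zeros, so counting zeros in $T$ already gives the optimum. A sketch with assumed example matrices:

```python
# Sketch: |E(G)| = |E(H)| when G ≅ H, i.e. column scaling (by non-zero
# factors) and column permutation preserve the total zero count.
S = [[1, 0, 0], [0, 2, 0], [0, 0, 0], [0, 0, 3]]
T = [[0, 1, 0], [4, 0, 0], [0, 0, 0], [0, 0, 9]]  # T = S·Q for some valid Q

def count_zeros(M):
    return sum(1 for row in M for v in row if v == 0)

assert count_zeros(S) == count_zeros(T) == 9  # m·n − |E| = 12 − 3
```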

You also stated that $m \cdot n - |E(G)| \geq n (n-1)/2$, or equivalently: $$ |E(G)| \leq m \cdot n - n(n-1)/2 $$

Under what conditions on $H$, or equivalently on the corresponding matrix $T$, does this hold? In other words, what constraints are missing from the optimization problem (P)?