How to calculate this weighted adjacency matrix?


In "Higher-order organization of complex networks", network motifs are used to transform a directed graph into a weighted undirected graph, so that we get a symmetric adjacency matrix. Here's how it works. There are thirteen motifs with three nodes; I'm interested in applying $\mathcal M_{4}$ and $\mathcal M_{13}$. For $\mathcal M_{4}$, the matrix-based formulation of the weighted motif adjacency matrix $W_{\mathcal M_{4}}$ is $$W_{\mathcal M_{4}}=(B \cdot B)\odot B$$ where $B$ is the adjacency matrix of the bidirectional links of the unweighted graph $G$. Formally, $B=A\odot A^T$, where $A$ is the adjacency matrix of $G$ and $\odot$ is the elementwise (Hadamard) product. However, the authors don't give the corresponding formulation for $\mathcal M_{13}$. The question is: what should the formulation of the weighted motif adjacency matrix of $\mathcal M_{13}$ be?
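For concreteness, the $\mathcal M_{4}$ formula from the question can be sketched in a few lines of numpy; the example adjacency matrix here is hypothetical, chosen only so that vertices $0$, $1$, $2$ form a bidirectional triangle:

```python
import numpy as np

# Hypothetical directed adjacency matrix A of a small graph G
# (vertices 0, 1, 2 are pairwise connected by bidirectional edges).
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
])

# B = A ⊙ A^T: adjacency matrix of the bidirectional links.
B = A * A.T

# W_M4 = (B · B) ⊙ B: matrix product, then elementwise (Hadamard) product.
W_M4 = (B @ B) * B

print(W_M4)
```

Note that `@` is the ordinary matrix product and `*` is the elementwise product, matching $\cdot$ and $\odot$ in the formula. In this example only the pairs inside the bidirectional triangle $\{0,1,2\}$ get nonzero weight.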

This slideshare may be helpful for understanding the paper.

[Figure: example graph]



BEST ANSWER

First, define a bit-flip operation $\backsim$ that changes the $1$'s of a matrix to $0$'s and vice versa. Practically, $\backsim A = J - A$, where $J$ is the all-ones matrix of the same size as $A$. We then need another operation $*$ defined by $*A=[(\backsim A)\odot(\backsim A^T)]-I$ for the appropriately sized identity matrix $I$.
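These two operations translate directly to numpy; here is a minimal sketch (the function names `bit_flip` and `star` are mine, not from the answer):

```python
import numpy as np

def bit_flip(M):
    # ~M: subtract M from the all-ones matrix, turning 1s into 0s and back.
    return np.ones_like(M) - M

def star(A):
    # *A = (~A ⊙ ~A^T) - I: entry (i, j) is 1 exactly when there is no edge
    # between i and j in either direction, with the diagonal forced to 0.
    n = A.shape[0]
    return bit_flip(A) * bit_flip(A.T) - np.eye(n, dtype=A.dtype)

# Tiny example: a single directed edge 0 -> 1 on three vertices.
A = np.array([
    [0, 1, 0],
    [0, 0, 0],
    [0, 0, 0],
])
print(star(A))
```

In the example, only the pairs $(0,2)$ and $(1,2)$ have no edge in either direction, so only those entries of $*A$ are $1$.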

Then the formula you are looking for is given by

$$W_{\mathcal M_{13}}=(B\cdot B)\odot (*A)+(B\cdot (*A))\odot B+((*A)\cdot B)\odot B$$
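Putting the pieces together, the whole formula can be sketched as one numpy function (the function name and the three-vertex test graph are mine; the test graph realizes $\mathcal M_{13}$ as two bidirectional edges $0\leftrightarrow 1\leftrightarrow 2$ with no edge between $0$ and $2$):

```python
import numpy as np

def motif_m13_weights(A):
    # W_M13 = (B·B) ⊙ *A + (B·*A) ⊙ B + (*A·B) ⊙ B
    n = A.shape[0]
    J = np.ones_like(A)
    B = A * A.T                                          # bidirectional links
    S = (J - A) * (J - A.T) - np.eye(n, dtype=A.dtype)   # *A: non-adjacent pairs
    return (B @ B) * S + (B @ S) * B + (S @ B) * B

# One instance of M13: 0 <-> 1 <-> 2, no edge between 0 and 2.
A = np.array([
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
])
W = motif_m13_weights(A)
print(W)  # every vertex pair co-occurs in exactly one M13 instance
```

On this graph the result is the all-ones matrix with zero diagonal, since each of the three vertex pairs appears in exactly one instance of the motif.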

The key to understanding this formula is to realize that the matrix operations act as logic gates such that $w_{ij}$ is only $1$ if there is a certain relationship between vertices $i$ and $j$.

Take $W_{\mathcal M_{4}}$ for example. An entry $w_{ij}$ of $B\cdot B$ is nonzero only if vertex $j$ can be reached from vertex $i$ by taking two bidirectional edges in a row (it counts the number of such paths). And an entry of $(B\cdot B)\odot B$ is nonzero only if the previous condition is met and there is also a single bidirectional edge between $i$ and $j$.

Now the explanation for $W_{\mathcal M_{13}}$.

[Figure: the motif $\mathcal M_{13}$ with its three vertex pairs labeled $a_1, a_2$, $b_1, b_2$, and $c_1, c_2$]

First, imagine we are dealing with vertices that have the same relationship as $a_1$ and $a_2$. This means the corresponding entry in $(B\cdot B)\odot(*A)$ should be $1$ since there is a bidirectional path of length $2$ between these vertices and they have no edge between them.

The second and third terms cover the cases where the vertices satisfy the same relationships as $b_1$ and $b_2$, or as $c_1$ and $c_2$, respectively. The $+$ signs between the terms function as OR gates, because within each motif each case occurs only once. On page $36$ of the slides you posted, you might wonder why some motifs require the additional step of adding the resulting matrix to its own transpose. This is done for motifs that lack a certain symmetry, where $v_1$ compared to $v_2$ might not satisfy any of the cases but $v_2$ compared to $v_1$ does.

In our case we don't have to worry about that because comparing $a_2$ to $a_1$ is the same as comparing $a_1$ to $a_2$. Comparing $b_2$ to $b_1$ is like comparing $c_1$ to $c_2$. And comparing $c_2$ to $c_1$ is like comparing $b_1$ to $b_2$.
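This symmetry claim can be spot-checked numerically: under the formula above, the third term is the transpose of the second, so the sum is symmetric without adding a transpose. A quick sketch on a random directed graph (graph and variable names mine):

```python
import numpy as np

# Random directed graph on 6 vertices, edge probability 0.4, no self-loops.
rng = np.random.default_rng(0)
n = 6
A = (rng.random((n, n)) < 0.4).astype(int)
np.fill_diagonal(A, 0)

J = np.ones_like(A)
B = A * A.T                                          # bidirectional links
S = (J - A) * (J - A.T) - np.eye(n, dtype=A.dtype)   # *A

t1 = (B @ B) * S   # symmetric on its own, since B and S are symmetric
t2 = (B @ S) * B
t3 = (S @ B) * B   # equals t2 transposed
W = t1 + t2 + t3

print(np.array_equal(W, W.T))  # prints True
```

Because $B$ and $*A$ are both symmetric, $((*A)\cdot B)\odot B = \big((B\cdot(*A))\odot B\big)^T$, which is exactly why no extra transpose step is needed for $\mathcal M_{13}$.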

Please let me know if any of this is confusing or doesn't make sense.

Edit: I realize I originally made two big mistakes about $A$. I treated it like it was the adjacency matrix for an undirected graph and I didn't consider the fact that bit flipping $A$ would cause the diagonal entries to be $1$.

The idea behind the new operation $*$ is that $*A$ is the adjacency matrix of the complement of the graph described by $A$. The part $(\backsim A)\odot(\backsim A^T)$ acts as a NAND gate between $A$ and $A^T$ so that entries are $1$ only if the two vertices don't share an edge. We subtract $I$ to make sure the diagonal entries are $0$.

Example: For the graph in the question, the matrices for the three terms of the formula are $$\begin{pmatrix} 0 & 0 & 1 & 1 & 0\\ 0 & 0 & 0 & 0 & 1\\ 1 & 0 & 0 & 1 & 0\\ 1 & 0 & 1 & 0 & 0\\ 0 & 1 & 0 & 0 & 0 \end{pmatrix} + \begin{pmatrix} 0 & 0 & 0 & 0 & 0\\ 2 & 0 & 2 & 2 & 0\\ 0 & 1 & 0 & 0 & 1\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} + \begin{pmatrix} 0 & 2 & 0 & 0 & 0\\ 0 & 0 & 1 & 0 & 0\\ 0 & 2 & 0 & 0 & 0\\ 0 & 2 & 0 & 0 & 0\\ 0 & 0 & 1 & 0 & 0 \end{pmatrix}$$ which sum to $$\begin{pmatrix} 0 & 2 & 1 & 1 & 0\\ 2 & 0 & 3 & 2 & 1\\ 1 & 3 & 0 & 1 & 1\\ 1 & 2 & 1 & 0 & 0\\ 0 & 1 & 1 & 0 & 0 \end{pmatrix}$$

[Figure: the resulting weighted graph]