On normalizing/transforming a matrix (arising in a graph generation model)


Assume you are given a symmetric matrix $\tilde{W}$ with the diagonal elements being some constant $C$ and the off-diagonal elements being $1$. You are additionally given a non-negative (probability) vector $\pi \in \mathbb{R}^K$ with $\sum_i \pi_i = 1$, which you should use to transform $\tilde{W}$ into a symmetric matrix $W$ having the property $\sum_{r}\sum_{s} w_{rs} = 1$.

A paper I am reading does this using $W = \frac{\pi \pi^T}{\pi^T \tilde{W} \pi} \circ \tilde{W}$ (without explaining it), where $\circ$ denotes the Hadamard product. However, I fail to see how to come up with this transformation / normalization.

Does anyone know how to come up with this?

Optional: Longer Problem Description including Context

We are given a graph whose nodes should be grouped into $K$ categories according to the proportions specified by the vector $\pi \in \mathbb{R}^K$ with $\sum_i \pi_i = 1$. Furthermore, an expected number of edges $|E|$ is given. We are interested in generating a graph whose expected number of edges between two categories $r$ and $s$ is described by $w_{rs}$, which we collect in a matrix $W \in \mathbb{R}^{K \times K}$. Furthermore, we assume $w_{rs}=w_{sr}$. It holds that $\sum_{r}\sum_{s} w_{rs} = |E|$.

Now assume we are only given $\pi$, $|E|$, $p$ (the probability that an edge connects two nodes of the same category) and $q$ (the probability that an edge connects two nodes of different categories). These probabilities do not depend on the concrete categories. Hence, $w_{rs}$ is proportional to $q$ if $r\ne s$ and to $p$ if $r = s$. From $p$ and $q$ a matrix $\tilde{W} \in \mathbb{R}^{K \times K}$ is constructed whose elements $\tilde{w}_{rs}$ are $1$ if $r \ne s$ and $\frac{p}{q}$ if $r=s$.

As written above, the paper states that $\tilde{W}$ can be used to construct $W$ through $W = \frac{\pi \pi^T}{\pi^T \tilde{W} \pi} |E| \circ \tilde{W}$, and I fail to see how to arrive at this transformation / normalization.

Answer

I was able to solve the problem by recognizing that $\pi \pi^T$ is multiplied element-wise with $\tilde{W}$, not via matrix multiplication!

Now, one can derive multiple things:

  • $\pi \pi^T$ defines a probability distribution over $K \times K$, since $\sum_i \sum_j (\pi \pi^T)_{ij} = \sum_i \sum_j \pi_i \pi_j = \sum_i (\pi_i \sum_j \pi_j) = 1$. In the graph context it can be interpreted as follows: in a fully connected graph, $(\pi \pi^T)_{ij}$ describes the proportion of edges connecting nodes from category $i$ to category $j$. In other words, out of all possible node pairs, it gives the proportion of pairs connecting a node from category $i$ to a node from category $j$.
  • $(\pi \pi^T \circ \tilde{W})_{ij} = \pi_i \pi_j \tilde{w}_{ij}$. This can be interpreted as scaling the probability $\pi_i \pi_j$ by an arbitrary factor $\tilde{w}_{ij}$ (so the entries of the new matrix no longer sum to $1$).
  • $\pi^T \tilde{W} \pi = \sum_i \pi_i \sum_j \pi_j \tilde{w}_{ij}$. One can recognize that this just corresponds to the sum over all elements of the matrix $\pi \pi^T \circ \tilde{W}$. This can be seen as $\sum_i \sum_j (\pi \pi^T \circ \tilde{W})_{ij} = \sum_i \sum_j \pi_i \pi_j \tilde{w}_{ij} = \sum_i \pi_i \sum_j \pi_j \tilde{w}_{ij} = \pi^T \tilde{W} \pi$.
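The key identity in the last bullet, $\pi^T \tilde{W} \pi = \sum_i \sum_j (\pi \pi^T \circ \tilde{W})_{ij}$, can be checked numerically. The values below ($K = 4$, a random $\pi$, and $C = 2.5$ standing in for $p/q$ on the diagonal) are just illustrative choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4

# pi: a probability vector (non-negative, sums to 1)
pi = rng.random(K)
pi /= pi.sum()

# W_tilde as in the question: ones off-diagonal, C = p/q on the diagonal
C = 2.5
W_tilde = np.ones((K, K))
np.fill_diagonal(W_tilde, C)

# quadratic form pi^T W_tilde pi ...
quad_form = pi @ W_tilde @ pi
# ... equals the sum of all elements of (pi pi^T) ∘ W_tilde
elementwise_sum = (np.outer(pi, pi) * W_tilde).sum()

assert np.isclose(quad_form, elementwise_sum)
```

So the denominator $\pi^T \tilde{W} \pi$ is exactly the total mass of the un-normalized matrix $\pi \pi^T \circ \tilde{W}$.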

Therefore, summing up all elements in $W = \frac{\pi \pi^T}{\pi^T \tilde{W} \pi} \circ \tilde{W}$ yields $1$, and hence $W$ defines a probability distribution over $K \times K$. Consequently, $w_{ij}$ is the probability that a randomly selected edge connects a node from category $i$ to a node from category $j$ (and multiplying by $|E|$, as in the longer problem description, turns these probabilities into expected edge counts).
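The whole construction can be sketched end to end. The parameter values ($\pi$, $p$, $q$, $|E|$) below are hypothetical placeholders, assuming only the definitions from the question:

```python
import numpy as np

# hypothetical inputs: category proportions, within/between probabilities, edge count
pi = np.array([0.5, 0.3, 0.2])
p, q, num_edges = 0.1, 0.02, 1000

K = len(pi)
# W_tilde: ones off-diagonal, p/q on the diagonal
W_tilde = np.ones((K, K))
np.fill_diagonal(W_tilde, p / q)

# normalization from the question: W = (pi pi^T / (pi^T W_tilde pi)) ∘ W_tilde
W = np.outer(pi, pi) * W_tilde / (pi @ W_tilde @ pi)

assert np.isclose(W.sum(), 1.0)   # W is a distribution over K x K
assert np.allclose(W, W.T)        # and remains symmetric

# scaling by |E| gives expected edge counts that sum to |E|
W_edges = num_edges * W
assert np.isclose(W_edges.sum(), num_edges)
```

Note that the diagonal entries of $W$ are boosted by the factor $p/q$ relative to a pure $\pi \pi^T$ model, while the normalization by $\pi^T \tilde{W} \pi$ restores a total mass of $1$.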