I am attempting to use Python to programmatically calculate the values in a dataset given the following equations:
\begin{align} f_{ij} &= \frac{g_{ij}}{\sum_{k} g_{ik}} \\\\ g_{ij} &= \frac{f_{ij}}{\sum_k f_{kj}} \\\\ \sum_j f_{ij} &= 1 \end{align}
where $i$, $j$, and $k$ all range over the same finite index set.
I am trying to recreate a simulation described in a research paper, and at this point the author describes how this system can be solved 'self-consistently.' I don't understand what this means, but it seems that the equations must be solved probabilistically. The values $f_{ij}$ and $g_{ij}$ represent scores describing how $i$ and $j$ interact with each other, and can be thought of as edge weights in a graph.
What is the best way to go about solving for each individual value of $f$ and $g$ using Python?
I understand that this is not a very specific question, but I don't know where to begin, and I have had trouble finding an explanation of what it means to programmatically solve a set of equations 'self-consistently.'
Let $F, G$ respectively be the matrices of the $f_{ij}$ and $g_{ij}$ values. Then the three equations say: $F$ is obtained from $G$ by scaling each row of $G$ so that it sums to $1$; $G$ is obtained from $F$ by scaling each column of $F$ so that it sums to $1$; and every row of $F$ sums to $1$.
In particular, this means there are infinitely many solutions in which $F = G$ and both the rows and the columns of this common matrix sum to $1$. Such matrices are called doubly stochastic matrices.
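As for the Python question: "self-consistently" is naturally read as a fixed-point computation, and one standard way to reach such a fixed point is to iterate the two normalization equations in turn until nothing changes (for positive starting matrices this is the Sinkhorn–Knopp algorithm, which converges to a doubly stochastic matrix). A minimal sketch, where the function name, tolerance, and random starting matrix are my own choices rather than anything from the paper:

```python
import numpy as np

def solve_self_consistently(g0, tol=1e-12, max_iter=10_000):
    """Iterate the two equations until F and G stop changing.

    f_ij = g_ij / sum_k g_ik   (row-normalize G to get F)
    g_ij = f_ij / sum_k f_kj   (column-normalize F to get G)
    """
    g = np.asarray(g0, dtype=float)
    for _ in range(max_iter):
        f = g / g.sum(axis=1, keepdims=True)      # row normalization
        g_new = f / f.sum(axis=0, keepdims=True)  # column normalization
        if np.abs(g_new - g).max() < tol:
            return f, g_new
        g = g_new
    raise RuntimeError("did not converge")

rng = np.random.default_rng(0)
f, g = solve_self_consistently(rng.random((4, 4)) + 0.1)
print(np.allclose(f, g))               # F = G, as argued above
print(np.allclose(f.sum(axis=1), 1))   # rows sum to 1
print(np.allclose(f.sum(axis=0), 1))   # columns sum to 1
```

Sinkhorn's theorem guarantees convergence whenever the starting matrix has all positive entries; starting matrices with zeros can instead converge to the exceptional block-structured solutions discussed at the end of this answer.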
These are the only solutions in which all entries of $F$ and $G$ are positive. To see this, let $r_i = \frac1{\sum_k g_{ik}}$ be the scaling factor applied to the $i^{\text{th}}$ row of $G$ to get $F$, and let $c_j = \frac1{\sum_k f_{kj}}$ be the scaling factor applied to the $j^{\text{th}}$ column of $F$ to get $G$. Then $g_{ij} = r_i c_j g_{ij}$, so if all the entries of $G$ are positive, we conclude that $r_i c_j = 1$ for all $i,j$. Since $r_i = \frac1{c_j}$ for every $j$, this means in particular that $r_1 = r_2 = \dots = r_n$ and $c_1 = c_2 = \dots = c_n$; therefore $F = rG$ and $G = cF$ for some constants $r,c$. But all the rows of $F$ sum to $1$ and all the columns of $G$ sum to $1$, so the entries of $F$ and the entries of $G$ each sum to $n$; since $F = rG$, this forces $r = 1$, and likewise $c = 1$, so $F = G$.
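The scaling-factor step of this argument can also be checked numerically: iterate the two equations to convergence from a positive start, then every product $r_i c_j$ should equal $1$. A quick sketch (the iteration count and starting matrix are arbitrary choices of mine):

```python
import numpy as np

# Check the key step numerically: at a positive fixed point, r_i * c_j = 1.
rng = np.random.default_rng(1)
g = rng.random((5, 5)) + 0.1      # arbitrary positive starting matrix
for _ in range(5_000):            # iterate the two equations to a fixed point
    f = g / g.sum(axis=1, keepdims=True)
    g = f / f.sum(axis=0, keepdims=True)

r = 1 / g.sum(axis=1)  # row scaling factors taking G to F
c = 1 / f.sum(axis=0)  # column scaling factors taking F to G
print(np.allclose(np.outer(r, c), 1))  # r_i * c_j = 1 for all i, j
```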
There are exceptional cases where $F$ and $G$ have zero entries and we don't get $F=G$. For example: $$ F = \begin{bmatrix}1 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 1/2 & 1/2\end{bmatrix}\qquad G = \begin{bmatrix}1/2 & 0 & 0 \\ 1/2 & 0 & 0 \\ 0 & 1 & 1 \end{bmatrix}. $$ Essentially, such solutions correspond to cases where $F$ and $G$ have a block structure: they split into two or more rectangular blocks, each a solution of the rectangular version of the problem, with zeroes preventing the blocks from interacting. When $F$ and $G$ are rectangular rather than square, we don't get $F=G$; instead, the all-positive case with $F$ and $G$ both $m \times n$ gives $F = \frac mn G$, where the rows of $F$ sum to $1$ and the columns of $F$ sum to $\frac mn$ (equivalently, the columns of $G$ sum to $1$).
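One can verify directly in NumPy that this exceptional pair satisfies all three equations even though $F \neq G$:

```python
import numpy as np

# The exceptional example: F and G satisfy both defining equations
# even though F != G, because the zero pattern splits them into blocks.
F = np.array([[1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.5, 0.5]])
G = np.array([[0.5, 0.0, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 1.0, 1.0]])

print(np.allclose(F, G / G.sum(axis=1, keepdims=True)))  # f_ij = g_ij / sum_k g_ik
print(np.allclose(G, F / F.sum(axis=0, keepdims=True)))  # g_ij = f_ij / sum_k f_kj
print(np.allclose(F.sum(axis=1), 1))                     # rows of F sum to 1
print(np.array_equal(F, G))                              # but F != G
```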