A connected $d$-regular graph $G = (V, E)$ is given, where each node has exactly $d$ outlinks to other nodes. A labelling function $l$ assigns each node a positive or negative label: $V_+$ is the set of nodes labelled $+1$ and $V_-$ the set labelled $-1$. Consider a random walk on $G$ in which the negatively labelled nodes are absorbing. We can then form the fundamental matrix $\mathbf F = (\mathbf I - A_{++})^{-1}$, where $A_{++}$ is the transition submatrix restricted to the transient (positive) nodes.
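To make the setup concrete, here is a minimal sketch of the construction, assuming a small 4-regular circulant graph and an arbitrary choice of which nodes are negative; both are hypothetical choices for illustration, not part of the question:

```python
import numpy as np

# Hypothetical 4-regular circulant graph on n nodes:
# node i links to i±1 and i±2 (mod n).
n, d = 10, 4
A = np.zeros((n, n))
for i in range(n):
    for off in (1, 2, n - 1, n - 2):
        A[i, (i + off) % n] = 1.0 / d   # row-stochastic transition matrix

# Suppose (arbitrarily) the last 3 nodes carry negative labels and are
# absorbing; the remaining nodes are positive, hence transient.
pos = np.arange(n - 3)
A_pp = A[np.ix_(pos, pos)]              # transient-to-transient block

# Fundamental matrix: F[i, j] = expected number of visits to j,
# starting from i, before absorption.
F = np.linalg.inv(np.eye(len(pos)) - A_pp)
```

Because every transient node can reach an absorbing node, $A_{++}$ is strictly substochastic and $\mathbf I - A_{++}$ is invertible.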
I'm trying to understand what properties can be established for the matrix $\mathbf F$. In particular, I would like to understand whether there is any mapping to the adjacency matrix, such as inequalities based on graph topology (e.g. neighbouring vs. non-neighbouring nodes exhibiting higher or lower values). Ideally I would like a property that links the edge distribution of the graph to the distribution of values in $\mathbf F$.
I think the most important property from a practical/computational point of view is that $A_{++}$ and $F$ commute and share eigenvectors. The Perron-Frobenius eigenvectors carry a lot of information about the network. You can write $$ A_{++} = \sum_j \lambda_j v_j u^T_j $$ where $v_j$ and $u^T_j$ are the right and left eigenvectors associated with $\lambda_j$, normalized so that $u^T_j v_j = 1$. Then $$ F = \sum_j \frac{1}{1 - \lambda_j} v_j u^T_j, $$ which converges because every transient node can reach an absorbing node, so the spectral radius of $A_{++}$ is strictly below 1.
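The spectral expansion of $F$ is easy to verify numerically. A minimal sketch, using a small symmetric random matrix with spectral radius below 1 as a stand-in for $A_{++}$ (a hypothetical example, not derived from any particular graph):

```python
import numpy as np

# Stand-in for the transient block A_++: small symmetric matrix,
# scaled so its spectral radius is safely below 1 (illustrative only).
rng = np.random.default_rng(0)
M = rng.random((6, 6))
A_pp = 0.1 * (M + M.T) / 2

lam, V = np.linalg.eig(A_pp)    # right eigenvectors are the columns of V
U = np.linalg.inv(V)            # rows of U are the left eigenvectors u_j^T

# F = sum_j 1/(1 - lambda_j) * v_j u_j^T, assembled term by term
F_spec = sum((1.0 / (1.0 - lam[j])) * np.outer(V[:, j], U[j, :])
             for j in range(len(lam)))

F_direct = np.linalg.inv(np.eye(6) - A_pp)
print(np.allclose(F_spec, F_direct))   # → True
```

The two constructions agree because $(\mathbf I - A_{++})^{-1}$ has the same eigenvectors as $A_{++}$, with each eigenvalue $\lambda_j$ mapped to $1/(1-\lambda_j)$.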
For networks which do not have a lot of community structure, the Perron-Frobenius eigenvalue is much bigger than the others, which means that in both of these expressions you can write $$ A_{++} \approx \lambda_1 v_1 u^T_1 $$ and $$ F \approx \frac{1}{1 - \lambda_1} v_1 u^T_1, $$ reducing everything to rank-1 matrices. This can be incredibly helpful as an approximation. See a couple of my other answers here and here for some more comments on the Perron-Frobenius eigenvector, random networks, and community structure. Note that even if there is community structure, you may end up with two or three eigenvalues dominating the others, in which case you just need to retain two or three terms in the approximation.
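A sketch of the rank-1 truncation, assuming (hypothetically) a transient block built as a uniform matrix plus small symmetric noise so that one Perron-Frobenius eigenvalue strongly dominates:

```python
import numpy as np

# Toy transient block with one strongly dominant eigenvalue, mimicking a
# network without much community structure (illustrative assumption).
rng = np.random.default_rng(1)
n = 8
R = rng.random((n, n))
A_pp = 0.08 * np.ones((n, n)) + 0.005 * (R + R.T)

lam, V = np.linalg.eig(A_pp)
U = np.linalg.inv(V)                     # rows are the left eigenvectors
k = int(np.argmax(np.abs(lam)))          # Perron-Frobenius index

# Keep only the leading term of each spectral expansion
A1 = lam[k] * np.outer(V[:, k], U[k, :])
F1 = np.outer(V[:, k], U[k, :]) / (1.0 - lam[k])

F = np.linalg.inv(np.eye(n) - A_pp)
rel_A = np.linalg.norm(A_pp - A1) / np.linalg.norm(A_pp)
rel_F = np.linalg.norm(F - F1) / np.linalg.norm(F)
print(f"rank-1 relative error: A_pp {rel_A:.3f}, F {rel_F:.3f}")
```

Note that the truncation is tighter for $A_{++}$ than for $F$: in $F$ the discarded terms each contribute $1/(1-\lambda_j) \approx 1$ rather than $\lambda_j \approx 0$, so the rank-1 form captures the dominant long-walk behaviour of $F$ rather than its full entrywise values.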
Unfortunately, because $A_{++}$ only represents a subset of the whole network, we can't say anything nice about its eigenvectors a priori. (For the full transition matrix of any Markov process, $1^T$ has to be a left eigenvector, but this is not true of $A_{++}$.) So, in a computational setting, you would generally have to calculate them, using power iteration, the Lanczos method, or full diagonalization.
However, there are additional simplifying assumptions that would allow you to approximate the Perron-Frobenius eigenvalue and eigenvectors using just degree information of the nodes. This paper explores several such approximations.
Your question doesn't have enough context for me to know whether this is the kind of answer you are looking for, or whether you are interested in a more rigorous treatment in terms of graph-theoretical invariants. But I hope it is useful.