I am new to spectral graph theory, so I apologize if there are any mistakes in my writing. If I make an error, please kindly point it out and guide me in the right direction.
Now let me clearly state the problem.
Consider a graph $G = (V, E)$. We randomly sample nodes from this graph and keep a count $n_i$ of the number of times the $i^\text{th}$ node has been sampled. The edge weights are defined to capture the similarity between nodes. Specifically, when node $i$ is selected, its count $n_i$ increases by 1, and this selection should also influence the nodes adjacent to $i$, in proportion to the weights of the connecting edges. To quantify this effect, we introduce the *effective count* of the $i^\text{th}$ node, $n_i^\text{eff.}$, given by
$$n_i^\text{eff.} = \left( \left[ (N + \rho L)^{-1} \right]_{ii} \right)^{-1},$$
where $N = \operatorname{diag}(n_1, \dots, n_{|V|})$ is the diagonal matrix of the counts, $\rho > 0$ is a scalar parameter, and $L$ is the Laplacian matrix of the graph, computed using the weighted degrees.
Now, my question is: how does one arrive at this equation? I have tried to build intuition by working through graphs with 2, 3, and 4 nodes. Specifically, why do we add the weighted Laplacian to $N$? Why do we take the inverse of this sum, extract the diagonal entry, and then invert again? And, ultimately, why does this equation work?
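For concreteness, here is how I evaluate the formula numerically. This is my own sketch on a toy example (a 3-node path graph with unit edge weights and $\rho = 1$, not from the paper):

```python
import numpy as np

# Toy example: 3-node path graph 1 -- 2 -- 3 with unit edge weights
# (my own choice for illustration, not from the paper).
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
D = np.diag(W.sum(axis=1))      # weighted degree matrix
L = D - W                       # weighted graph Laplacian
rho = 1.0                       # scalar parameter in the formula

n = np.array([4., 1., 0.])      # raw sample counts n_i
N = np.diag(n)

M = np.linalg.inv(N + rho * L)  # [N + rho L]^{-1}
n_eff = 1.0 / np.diag(M)        # n_i^eff = ( [ (N + rho L)^{-1} ]_{ii} )^{-1}

print(n_eff)                    # -> [4.5  1.8  0.64285714]
```

What I observe is that node 3, which was never sampled directly, still ends up with a positive effective count because of its edge to node 2, which is the kind of "smoothing" behavior I would expect the equation to produce.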
I am asking this question to understand the equation given on page 6 of this paper: https://arxiv.org/abs/2108.01152