Consider a directed graph $G$ with (asymmetric) adjacency matrix $A_G$. In the context of recurrent neural networks, the largest eigenvalue magnitude of $A_G$ (i.e., the spectral radius $\lambda$) is used as a stability constraint on the network dynamics: by scaling the network weights ($A_G$) according to $\lambda$, we ensure that a signal is not amplified beyond a certain factor as it traverses recurrent paths within the network.
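For concreteness, here is a minimal sketch of the scaling step, assuming a random dense weight matrix and a target spectral radius of 0.9 (a common heuristic in, e.g., echo state networks; the matrix and target value are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical recurrent weight matrix (adjacency matrix of a directed graph).
n = 50
W = rng.standard_normal((n, n))

# Spectral radius: the largest eigenvalue magnitude.
rho = max(abs(np.linalg.eigvals(W)))

# Rescale so the spectral radius equals a target value below 1,
# keeping signal amplification along recurrent paths bounded.
target = 0.9
W_scaled = W * (target / rho)
```

Since eigenvalues scale linearly with the matrix, `W_scaled` has spectral radius exactly `target` (up to floating-point error).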
Do the other eigenvalues of $A_G$ have a direct relation to, or explanatory value for, further recurrence in $A_G$? In other words, while $\lambda$ allows us to scale the macro-level dynamics of the network, can the eigenvalues also provide insight into more micro-level aspects of recurrence?
If not eigenvalues, are there other linear-algebraic measures for analyzing recurrence in graphs? (As an alternative to direct graph-theoretic approaches such as shortest paths, cycles, etc.)
Consider transforming the adjacency matrix $A$ by dividing each row by its sum to get a matrix $P$, such that $$ P_{ij} = \dfrac{A_{ij}}{\sum_{k} A_{ik}}. $$ You can now interpret $P_{ij}$ as the probability that a "particle" taking a random walk on the graph transitions from node $i$ to node $j$.
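This row normalization is a one-liner; a small sketch with a hypothetical 3-node adjacency matrix (note that every node needs at least one outgoing edge, otherwise its row sum is zero and the division is undefined):

```python
import numpy as np

# Hypothetical adjacency matrix of a small directed graph (nonnegative weights).
A = np.array([
    [0., 1., 1.],
    [1., 0., 0.],
    [0., 1., 0.],
])

# Divide each row by its sum so that row i becomes the distribution
# over the next node visited by a random walker currently at node i.
P = A / A.sum(axis=1, keepdims=True)
```

Every row of `P` now sums to 1, i.e. `P` is a (row-)stochastic transition matrix.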
This is a Markov chain, and since the state space is finite it has a stationary distribution $\pi$ satisfying $\pi^\top P = \pi^\top$, i.e. $\pi$ is a left eigenvector of $P$ with eigenvalue $1$ (unique when the chain is irreducible). The stationary distribution describes the long-run behavior of the particle's process: what fraction of an infinitely long walk does the particle spend at each node of the graph? You can find it using eigendecomposition.
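A minimal sketch of that eigendecomposition, using the transition matrix of a small irreducible chain (the left eigenvector of $P$ is computed as a right eigenvector of $P^\top$; the example matrix is illustrative):

```python
import numpy as np

# Transition matrix of a small irreducible chain (rows sum to 1);
# irreducibility guarantees a unique stationary distribution.
P = np.array([
    [0.0, 0.5, 0.5],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
])

# The stationary distribution is a left eigenvector of P with
# eigenvalue 1, i.e. a right eigenvector of P.T.
vals, vecs = np.linalg.eig(P.T)
i = np.argmin(abs(vals - 1.0))
pi = np.real(vecs[:, i])
pi = pi / pi.sum()  # normalize to a probability distribution

# pi[i] is the long-run fraction of time the walk spends at node i,
# and pi @ P == pi (stationarity).
```

For this matrix the result is $\pi = (0.4, 0.4, 0.2)$: the walk spends twice as much time at nodes 0 and 1 as at node 2.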
So the eigenvectors of the row-normalized adjacency matrix $P$ are essentially transformations of the stationary distribution of the random walk on the graph.