This is a problem motivated by spatial autoregressive models. Assume $A$ is an $n\times n$ non-negative matrix whose rows each sum to $1$ (i.e., $A$ is row-stochastic). Let $I$ be the $n\times n$ identity matrix and $c\in\mathbb{R}$. Define \begin{align*} A_1 &= (I-cA)^{-1}(I-cA^\top)^{-1},\\ A_2 &= (I-cA)^{-1}(I-cA^\top)^{-1}(A^\top+A-2cA^\top A)(I-cA)^{-1}(I-cA^\top)^{-1}. \end{align*} As $n\to\infty$, can we upper bound the spectral radius $\rho(A_1)$ (or the largest eigenvalue $\lambda_{\max}(A_1)$) and $\rho(A_2)$ by adding appropriate assumptions (perhaps on $A+A^\top$ or $A^\top A$)?
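As a numerical sanity check of the setup (just a sketch; the size $n=50$, the value $c=0.5$, and the random row-stochastic $A$ are arbitrary choices of mine, not part of the question), one can build $A_1$ and $A_2$ directly and compute their spectral radii. Both matrices are symmetric, so the spectral radius is the largest absolute eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 50, 0.5

# Random row-stochastic A: non-negative entries, each row sums to 1.
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)

I = np.eye(n)
B = np.linalg.inv(I - c * A)      # (I - cA)^{-1}; exists since rho(cA) = c < 1
A1 = B @ B.T                      # (I - cA)^{-1}(I - cA^T)^{-1}, symmetric PD
V = A.T + A - 2 * c * A.T @ A     # middle factor of A2, symmetric
A2 = A1 @ V @ A1                  # symmetric

# Spectral radii via symmetric eigensolver.
print(np.max(np.abs(np.linalg.eigvalsh(A1))))
print(np.max(np.abs(np.linalg.eigvalsh(A2))))
```

Since $|c|<1$ here and $\rho(A)=1$ for a row-stochastic matrix, $I-cA$ is invertible, so the construction is well defined for any such $A$.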
Here is my consideration. Note that $$A_1^{-1} = (I-cA^\top)(I-cA) = I - c\,(A+A^\top - cA^\top A),$$ so if $\lambda_1,\dots,\lambda_n$ are the eigenvalues of the symmetric matrix $A+A^\top - cA^\top A$, then the eigenvalues of $A_1$ are $(1-c\lambda_i)^{-1}$. Since $A_1 = (I-cA)^{-1}(I-cA)^{-\top}\succ 0$, each $1-c\lambda_i$ is automatically positive; the danger is that some $\lambda_i$ gets close to $c^{-1}$, making $(1-c\lambda_i)^{-1}$ blow up. Perhaps the right assumption is a gap condition uniform in $n$, e.g. (for $c>0$) $\lambda_{\max}(A+A^\top - cA^\top A)\leq c^{-1}-\delta$ for some fixed $\delta>0$, which would give $\rho(A_1)\leq (c\delta)^{-1}$, but I am not sure how best to state this.
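The claimed eigenvalue relation is easy to check numerically (again a sketch; $n$, $c$, and the random row-stochastic $A$ are arbitrary assumptions of mine): the eigenvalues of $A_1$ should match $(1-c\lambda_i)^{-1}$ applied to the eigenvalues of $A+A^\top-cA^\top A$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, c = 30, 0.4

# Random row-stochastic A.
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)
I = np.eye(n)

# A1^{-1} = (I - cA^T)(I - cA) = I - cS  with  S = A + A^T - c A^T A.
S = A + A.T - c * A.T @ A
A1 = np.linalg.inv((I - c * A.T) @ (I - c * A))

pred = 1.0 / (1.0 - c * np.linalg.eigvalsh(S))  # claimed eigenvalues of A1
actual = np.linalg.eigvalsh(A1)

# Compare as sorted multisets; the gap is at the level of roundoff.
print(np.max(np.abs(np.sort(pred) - np.sort(actual))))
```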
For $A_2$, write $U = (I-cA^\top)(I-cA)$ and $V = A^\top+A-2cA^\top A$, so that $A_2 = U^{-1}VU^{-1}$; note that $U$ is symmetric positive definite and $V$ is symmetric. We want to show $\lambda_{\max}(U^{-1}VU^{-1})\leq K$ for some $K>0$. I believe it suffices to show that \begin{align*} \quad & \lambda_{\max}(U^{-1}VU^{-1})\leq K\\ \Leftrightarrow & \frac{y^\top U^{-1}VU^{-1} y}{y^\top y} \leq K \quad (\text{for all }y\neq 0)\\ \Leftrightarrow & x^\top V x\leq K\, x^\top U^2 x\quad (x = U^{-1}y,\text{ i.e. } y = Ux)\\ \Leftrightarrow & V\preceq KU^2 \end{align*} in the Löwner order, where the first equivalence uses that $U^{-1}VU^{-1}$ is symmetric, so its largest eigenvalue is the maximum of its Rayleigh quotient. I am not sure if this is the right way.
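Numerically the equivalence does check out (a sketch under the same arbitrary random row-stochastic setup as above): taking $K = \lambda_{\max}(U^{-1}VU^{-1})$, the matrix $KU^2 - V$ is positive semidefinite, while any smaller constant fails, so $K$ is exactly the smallest constant with $V \preceq KU^2$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, c = 30, 0.4

A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)
I = np.eye(n)

U = (I - c * A.T) @ (I - c * A)    # symmetric positive definite
V = A.T + A - 2 * c * A.T @ A      # symmetric
Uinv = np.linalg.inv(U)

M = Uinv @ V @ Uinv
M = (M + M.T) / 2                  # symmetrize against roundoff
K = np.linalg.eigvalsh(M)[-1]      # lambda_max(U^{-1} V U^{-1})

# K*U^2 - V = U (K*I - M) U is PSD (min eigenvalue ~ 0 up to roundoff),
# while (K - 0.01)*U^2 - V already has a negative eigenvalue.
print(np.linalg.eigvalsh(K * U @ U - V).min())
print(np.linalg.eigvalsh((K - 0.01) * U @ U - V).min())
```

The design point is that $KU^2 - V = U(KI - M)U$ is a congruence, and congruence preserves the signs of eigenvalues (Sylvester's law of inertia), which is exactly the content of the displayed chain of equivalences.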