Let $A$ be an irreducible, aperiodic matrix with non-negative entries, with $1 \in \ker(A - I)$ and $w \in \ker(A^\top - I)$, where $w_i > 0$ for all $i$. Define $W = \text{diag}(w)$. I am studying the matrix $$ R = W^{-1} A^\top WA; $$ it is clear that $R$ is an irreducible matrix with non-negative entries such that $1 \in \ker(R - I)$ and $w \in \ker(R^\top - I)$. Moreover, $W^{-1} R^H W = R$, where $\cdot^H$ denotes the conjugate transpose. Hence $B = W^{1/2} R W^{-1/2}$ is Hermitian, so it has real eigenvalues, and therefore $R$ has real eigenvalues.
I'd like to know what conditions ensure that the next largest eigenvalue of $R$ after $1$, say $\lambda_2(R)$, satisfies $$ \lambda_2(R)^{1/2} \geq |\lambda_2(A)|. $$ I realize that nothing I have written guarantees $\lambda_2(R) > 0$, which is needed to apply the square root. Nevertheless, the inequality has held in every example I have tried, and in many cases with exact equality.
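For what it's worth, the setup is easy to probe numerically. The following NumPy sketch is my own illustrative choice (a random positive row-stochastic $A$, which is automatically irreducible and aperiodic, with an arbitrary seed): it computes the left Perron vector $w$, forms $R = W^{-1}A^\top W A$, and checks the inequality $\lambda_2(R)^{1/2} \geq |\lambda_2(A)|$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Random positive row-stochastic matrix: A @ ones = ones (so 1 is in ker(A - I)),
# and positivity makes A irreducible and aperiodic.
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)

# Left Perron vector w: A.T @ w = w, normalised so that w > 0 entrywise.
evals, evecs = np.linalg.eig(A.T)
w = np.abs(evecs[:, np.argmax(evals.real)].real)
w /= w.sum()

W = np.diag(w)
R = np.linalg.inv(W) @ A.T @ W @ A

# Eigenvalues of R are real and nonnegative (R is similar to a PSD matrix);
# sort them in decreasing order.
lam_R = np.sort(np.linalg.eigvals(R).real)[::-1]

# Eigenvalues of A sorted by decreasing modulus (they may be complex).
lam_A = sorted(np.linalg.eigvals(A), key=abs, reverse=True)

print(np.sqrt(max(lam_R[1], 0.0)), abs(lam_A[1]))
assert np.sqrt(max(lam_R[1], 0.0)) >= abs(lam_A[1]) - 1e-10
```

Varying the seed and dimension, I always observe the inequality, consistent with the examples described above.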
Of course things are easier in finite dimensions, and I appreciate any help/comments at all, but ideally I would get conditions that work in the infinite dimensional case.
Thanks!
Here is a proof for the finite dimensional case. Since the OP is primarily interested in the infinite dimensional case, I am not sure if this helps.
With the stated assumptions, the inequality always holds. We need $A$ to be irreducible, but aperiodicity (equivalently, primitivity for an irreducible matrix) is not required.
Denote the $k$-th largest singular value and the $k$-th dominant eigenvalue (ordered by decreasing modulus) of a matrix $X$ by $\sigma_k(X)$ and $\lambda^\downarrow_k(X)$ respectively (so the second dominant eigenvalue is $\lambda^\downarrow_2(X)$). Let $C=W^{1/2}AW^{-1/2}$. Then $R=W^{-1/2}C^TCW^{1/2}$ is similar to the positive semidefinite matrix $C^TC$, so all eigenvalues of $R$ are real and nonnegative, and $\lambda^\downarrow_k(R)=\sigma_k(C)^2$. Since $C$ is similar to $A$, the inequality in question (in which we take the modulus of $\lambda^\downarrow_2(A)$ because eigenvalues of $A$ can be nonreal) is equivalent to $$\sigma_2(C)\ge|\lambda^\downarrow_2(C)|.$$ To prove this inequality, let $v$ be the entrywise square root of $w$. It is straightforward to verify that $v$ is both a left and a right eigenvector of $C$ corresponding to the largest eigenvalue $\lambda^\downarrow_1(C)=1$. As $A$ is irreducible and entrywise nonnegative, so is $C$. Therefore, by the Perron–Frobenius theorem, $\lambda^\downarrow_1(C)=1$ is a simple eigenvalue, hence different from $\lambda^\downarrow_2(C)$. So, if $u$ is an eigenvector of $C$ corresponding to $\lambda^\downarrow_2(C)$, then $v^HCu$ equals both $\lambda^\downarrow_1(C)\,v^Hu$ and $\lambda^\downarrow_2(C)\,v^Hu$, which forces $u\perp v$.
Normalise $v$ and $u$ to unit length and let $S_0$ be their linear span over $\mathbb C$. Viewing $C$ as a complex matrix, the Courant–Fischer minimax principle gives $$ \sigma_2(C) =\max_{\dim S=2} \min_{x\in S,\ \|x\|_2=1} \|Cx\|_2 \ge \min_{x\in S_0,\ \|x\|_2=1} \|Cx\|_2 \color{red}{=} \|Cu\|_2 = |\lambda^\downarrow_2(C)|, $$ where the maximum is taken over all two-dimensional complex subspaces $S$. The equality in red holds because $u,v$ are orthonormal eigenvectors of $C$: for $x=\alpha v+\beta u$ with $|\alpha|^2+|\beta|^2=1$ we have $\|Cx\|_2^2=|\alpha|^2+|\lambda^\downarrow_2(C)|^2|\beta|^2\ge|\lambda^\downarrow_2(C)|^2$, since $|\lambda^\downarrow_2(C)|\le\lambda^\downarrow_1(C)=1$, so the minimum over $S_0$ is attained at $x=u$.
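As a numerical sanity check of the key inequality $\sigma_2(C)\ge|\lambda^\downarrow_2(C)|$, here is a short NumPy sketch. The row-stochastic matrix and the seed are again arbitrary illustrative choices, not data from the question.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Random positive row-stochastic A and its left Perron vector w, as in the question.
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)
evals, evecs = np.linalg.eig(A.T)
w = np.abs(evecs[:, np.argmax(evals.real)].real)
w /= w.sum()

# C = W^{1/2} A W^{-1/2} shares A's eigenvalues; its Perron vector is sqrt(w).
s = np.sqrt(w)
C = np.diag(s) @ A @ np.diag(1.0 / s)

sigma = np.linalg.svd(C, compute_uv=False)              # singular values, descending
lam = sorted(np.abs(np.linalg.eigvals(C)), reverse=True)  # |eigenvalues|, descending

assert abs(lam[0] - 1.0) < 1e-8      # Perron eigenvalue of C is 1
assert sigma[1] >= lam[1] - 1e-10    # sigma_2(C) >= |lambda_2(C)|
```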