I have a matrix-valued function $R \mapsto J(R)$ from $\mathbb{R}$ to the set of irreducible matrices with non-negative entries. We can assume that $J(R)$ is $d \times d$, although any solutions that work for infinite-dimensional matrices would be greatly appreciated too! The function is continuous with respect to the Euclidean metric on $\mathbb{R}^{d^2}$.
I know that the Perron-Frobenius eigenvalue of $J(\alpha)$ is $1$ for some $\alpha > 0$. What I want to know is: if I find $R > 0$ such that $$ \det(I - J(R)) < \epsilon $$ for some small $\epsilon > 0$, what can I say about how close $R$ is to $\alpha$?
Additional properties of $J$: it may also be useful to mention that the non-zero entries of $J(R)$ increase monotonically in $R$.
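For concreteness, here is a small numerical sketch of the kind of behaviour I mean. The family $J(R) = R \cdot A$ below is purely a toy assumption (not my actual $J$), chosen so that the Perron root of $J(R)$ is exactly $R$ and hence $\alpha = 1$; for this family $\det(I - J(R))$ happens to equal $1 - R$ exactly, so smallness of the determinant translates linearly into closeness of $R$ to $\alpha$:

```python
import numpy as np

# Toy family (assumption for illustration only): J(R) = R * A
# for a fixed irreducible non-negative matrix A with Perron root 1.
A = np.array([[0.5, 0.5],
              [0.5, 0.5]])

def perron_root(M):
    """Spectral radius; for an irreducible non-negative matrix this is
    the Perron-Frobenius eigenvalue."""
    return max(abs(np.linalg.eigvals(M)))

# Perron root of J(R) = R * A is R * perron_root(A) = R, so alpha = 1 here.
alpha = 1.0

for R in [0.9, 0.99, 0.999]:
    J = R * A
    det_val = np.linalg.det(np.eye(2) - J)
    print(f"R = {R:6.3f}   det(I - J(R)) = {det_val:+.4f}"
          f"   |R - alpha| = {abs(R - alpha):.3f}")
```

In this toy case the conversion factor between $\det(I - J(R))$ and $|R - \alpha|$ is the derivative of $R \mapsto \det(I - J(R))$ at $\alpha$, which is bounded away from zero; my question is essentially whether something like this can be said in general, or what extra hypotheses on $J$ are needed.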
Thanks very much, and sorry that I couldn't find an apt yet concise title for the question!