I have the following equation $x = e + xC$, which I rewrite as $x = e(I-C)^{-1}$. Here $x$ is an $L$-dimensional row vector, $e$ is the $L$-dimensional vector of ones, and $C$ is an $L \times L$ positive matrix (i.e. $c_{ij} > 0$ for all $i,j$).
So now I relate the existence of this inverse to the matrix geometric series. Let $\rho(C)$ denote the spectral radius of $C$; then we know that $(I-C)^{-1}$ exists if $\rho(C) < 1$. My first question is whether the converse must also hold.
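As a sanity check of the geometric-series claim, here is a small NumPy sketch (the particular matrix of all $0.1$ entries is a hypothetical choice, picked only so that the spectral radius is below $1$):

```python
import numpy as np

# Hypothetical entrywise-positive C with rho(C) < 1 (here rho(C) = 0.3).
C = np.full((3, 3), 0.1)
rho = max(abs(np.linalg.eigvals(C)))
assert rho < 1

# Partial sums of the geometric series I + C + C^2 + ... approach (I - C)^{-1}.
S, term = np.eye(3), np.eye(3)
for _ in range(200):
    term = term @ C
    S += term

print(np.allclose(S, np.linalg.inv(np.eye(3) - C)))  # True
```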
The reason for this question is that I am trying to solve the above equation with a matrix $C$ for which $\rho(C) > 1$, yet Mathematica (and I know there can be problems here) finds that $\det(I-C) > 0$ and $\operatorname{Rank}(I-C) = L$. So I thought there must be a problem here. Also, the sequence $C^k$ does not converge, which is consistent with $\rho(C) > 1$. However, Mathematica also finds an inverse for $(I-C)$, and this inverse actually solves the equation above.
So if the converse is not true, how would I prove that $(I-C)^{-1}$ exists? Note that $L$ is arbitrarily large, and I am actually interested in what happens as $L$ becomes large, so computing determinants etc. is not helpful.
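The situation described above can be reproduced numerically without Mathematica. The sketch below uses a hypothetical entrywise-positive matrix (all entries equal to $2$, so $\rho(C) = 6 > 1$) and checks that $(I-C)^{-1}$ nevertheless exists and that $x = e(I-C)^{-1}$ solves the fixed-point equation:

```python
import numpy as np

# Hypothetical entrywise-positive matrix with spectral radius > 1.
C = np.full((3, 3), 2.0)           # all entries 2, so rho(C) = 6
rho = max(abs(np.linalg.eigvals(C)))
assert rho > 1

# (I - C) is still invertible, and x = e (I - C)^{-1} (row-vector convention)
I = np.eye(3)
e = np.ones(3)
x = e @ np.linalg.inv(I - C)

# x solves the original fixed-point equation x = e + x C.
print(np.allclose(x, e + x @ C))   # True
```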
In general, a matrix $A$ is invertible if and only if it is non-singular, meaning that there is no vector $v \ne 0$ such that $vA = 0$.
In your case $A = I-C$, so the inverse $(I-C)^{-1}$ exists if and only if for every $v \ne 0$ you have $$v(I-C) \ne 0,$$ which is the same as $$v \ne vC.$$
If $\rho(C)<1$, then $vC \ne v$ for every $v \ne 0$; otherwise $v$ would be an eigenvector of $C$ associated to the eigenvalue $1$, contradicting $\rho(C) < 1$. So this condition ensures invertibility, but the converse is not true: what matters is only that $1$ is not an eigenvalue of $C$. For example, take $C = nI$ where $I$ is the identity matrix and $n = 2, 3, 4, \dots$. You have $\rho(C) = n > 1$, but $vC = nv \ne v$ for all $v \ne 0$, so $I-C$ is invertible. (This $C$ is not entrywise positive, but the same phenomenon occurs for positive matrices, e.g. the $2 \times 2$ matrix with all entries equal to $2$, whose eigenvalues are $4$ and $0$.)
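The counterexample can be checked numerically; $n = 3$ and $L = 4$ below are arbitrary choices for the sketch:

```python
import numpy as np

n, L = 3, 4                         # arbitrary choices for the sketch
C = n * np.eye(L)                   # C = nI, so rho(C) = n > 1

# 1 is not an eigenvalue of C (its only eigenvalue is n),
# so I - C is non-singular despite rho(C) > 1.
eigs = np.linalg.eigvals(C)
print(np.any(np.isclose(eigs, 1.0)))       # False: 1 is not an eigenvalue
print(np.linalg.det(np.eye(L) - C))        # (1 - n)^L = 16, nonzero
```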