Unfortunately, I have not been able to find literature that describes the computational complexity (in $O$ notation) of computing the spectral radius of a square, non-symmetric matrix. In particular, my matrix is $A \in \mathbb{R}^{N \times N}$, where $N$ is a finite integer.
While there is some literature from graph theory, which covers symmetric matrices (adjacency matrices), my search for the non-symmetric case was unsuccessful.
I would be very grateful for any leads, including leads on approximations of the spectral radius.
The eigenvalues of an $N$ by $N$ matrix (and thus the spectral radius) can be computed in $O(N^{3})$ time by Francis's algorithm (a.k.a. the implicitly shifted QR method).
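As an illustration (not from the answer; a minimal NumPy sketch with an arbitrary random test matrix), the dense approach is:

```python
import numpy as np

# Arbitrary non-symmetric test matrix (illustrative choice).
rng = np.random.default_rng(0)
N = 200
A = rng.standard_normal((N, N))

# Compute all eigenvalues (LAPACK uses the shifted QR algorithm here),
# then take the largest magnitude: O(N^3) time, O(N^2) memory.
spectral_radius = np.abs(np.linalg.eigvals(A)).max()
print(spectral_radius)
```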
The Arnoldi method for computing the largest-magnitude eigenvalue might be faster than computing all of the eigenvalues, particularly if the matrix is sparse. If the matrix is dense, then each iteration takes $O(N^2)$ time, but the number of iterations depends on the spectrum of $A$ and can be large in some cases. I'm not aware of a worst-case analysis of this, but in practice this is another $O(N^{3})$ method for dense matrices.
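For the sparse case, here is a hedged sketch using SciPy's ARPACK-based Arnoldi wrapper, `scipy.sparse.linalg.eigs` (the matrix size and density are arbitrary choices):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Arbitrary sparse non-symmetric test matrix with nonnegative entries,
# so it has a well-separated Perron eigenvalue and Arnoldi converges quickly.
rng = np.random.default_rng(0)
N = 1000
A = sp.random(N, N, density=0.01, format="csr", random_state=rng)

# Ask ARPACK for the single eigenvalue of largest magnitude ('LM').
vals = spla.eigs(A, k=1, which="LM", return_eigenvectors=False)
spectral_radius = np.abs(vals[0])
print(spectral_radius)
```

For a matrix without a well-separated dominant eigenvalue, ARPACK may need many iterations or fail to converge, which is the spectrum dependence described above.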
You can play with the two approaches in MATLAB using `max(abs(eig(A)))` versus `abs(eigs(A, 1))`.
For random dense matrices from 1000 by 1000 up to 10000 by 10000, eigs() is roughly twice as fast as eig(), but both scale as $O(N^{3})$.
The results returned by eigs() with default tolerance are accurate to about 5 digits. Furthermore, there's quite a bit of variance in the time taken by eigs() depending on the particular matrix.
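A rough Python analogue of that timing experiment (smaller, arbitrary sizes; nonnegative random matrices so that the Arnoldi iteration converges quickly) might look like:

```python
import time
import numpy as np
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
for N in (500, 1000):  # scale these up to probe the O(N^3) trend
    A = rng.random((N, N))  # dense nonnegative test matrix

    t0 = time.perf_counter()
    rho_full = np.abs(np.linalg.eigvals(A)).max()  # all eigenvalues
    t_full = time.perf_counter() - t0

    t0 = time.perf_counter()
    vals = spla.eigs(A, k=1, which="LM", return_eigenvectors=False)
    rho_arnoldi = np.abs(vals[0])  # Arnoldi, largest magnitude only
    t_arnoldi = time.perf_counter() - t0

    print(f"N={N}: full {t_full:.3f}s, Arnoldi {t_arnoldi:.3f}s, "
          f"|difference| = {abs(rho_full - rho_arnoldi):.2e}")
```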