I am looking for well-known algorithms in which sparse matrix elements are accessed in an unstructured way, i.e., the row/column accessed depends on the value of another (sparse) matrix/vector element or on some variable that changes over the course of the algorithm.
So, if one were to execute such an algorithm on a computer, or even by hand on paper, there should be no obvious pattern of adjacent non-zero elements in a single row/column being accessed consecutively.
For example, consider a sparse matrix $$ M = \begin{bmatrix} 0 & 1 & 0 & 0 & 2 \\ 0 & 3 & 4 & 0 & 5 \\ 6 & 0 & 0 & 7 & 0 \\ 0 & 0 & 0 & 0 & 8 \\ 0 & 0 & 0 & 0 & 9 \end{bmatrix} $$
I am not interested in algorithms which contain fairly regular element access patterns like:
- `3 4 5` - accesses the non-zero elements in row 2, in order
- `9 8 0 5 2` - accesses the last column in reverse order
- `1 4 7 8` - accesses the superdiagonal, still too structured/regular
but those which involve a lot of "random" patterns like:
- `6 8 1 9 4 3` - accesses only non-zeroes, very interesting
- `3 0 8 5 1 0 0 6` - accessing some zero elements is also interesting
- `8 0 2 5 9` - a permutation of a row/column is still interesting enough
(Of course, the exact access pattern depends on the implementation of the algorithm, but I am looking for algorithms in which there is no obvious way to access, for instance, an entire row/column of the matrix in (reverse) order.)
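To make the kind of pattern I mean concrete, here is a minimal Python sketch of one well-known example: breadth-first search over a graph stored as a CSR adjacency matrix. Which row of the sparse matrix is scanned next depends on column indices just read from a previous row, so the rows are visited in a data-dependent, irregular order. (The function name `bfs_csr` and the use of my example matrix $M$ as the adjacency pattern are just for illustration.)

```python
from collections import deque

def bfs_csr(indptr, indices, start):
    """Breadth-first search over a graph in CSR form.

    The next row of the sparse matrix to be scanned depends on the
    column indices just read from the current row, so rows are
    accessed in a data-dependent, irregular order.
    """
    n = len(indptr) - 1
    visited = [False] * n
    visited[start] = True
    order = [start]
    q = deque([start])
    while q:
        u = q.popleft()
        # Column indices of row u determine which rows are touched next.
        for v in indices[indptr[u]:indptr[u + 1]]:
            if not visited[v]:
                visited[v] = True
                order.append(v)
                q.append(v)
    return order

# Non-zero pattern of the 5x5 example matrix M above (0-indexed):
# row 0: cols 1, 4; row 1: cols 1, 2, 4; row 2: cols 0, 3;
# row 3: col 4; row 4: col 4.
indptr = [0, 2, 5, 7, 8, 9]
indices = [1, 4, 1, 2, 4, 0, 3, 4, 4]
print(bfs_csr(indptr, indices, 0))  # rows visited: [0, 1, 4, 2, 3]
```

Starting from row 0, the traversal jumps to rows 1 and 4 because those column indices appear in row 0, then to row 2, and only reaches row 3 via row 2 — an order dictated by the stored values, not by any fixed stride.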
I would prefer a deterministic algorithm, but randomized ones are also fine.