I have a matrix with the following sparsity pattern:
$M = \begin{bmatrix} \ast &\ast &0 &0 &0 &0 &0 &0\\ 0 & 0 &\ast &\ast &0 &0 &0 &0 \\ 0 &0 &0 &0 &\ast &\ast &0 &0 \\ 0 &0 &0 &0 &0 &0 &\ast &\ast \\ \ast &\ast &0 &0 &0 &0 &0 &0\\ 0 & 0 &\ast &\ast &0 &0 &0 &0 \\ 0 &0 &0 &0 &\ast &\ast &0 &0 \\ 0 &0 &0 &0 &0 &0 &\ast &\ast \end{bmatrix}$
where $\ast$ denotes non-zero entries. It is a right stochastic matrix (a transition matrix for a Markov chain; each row sums to one), so I know it has a left eigenvector with eigenvalue 1, i.e., some $x$ for:
$xM = x$
I can use power iteration to find the eigenvector $x$. I was wondering whether there is a faster way of getting the solution for this specially structured matrix. (Also, is there a name for this sparsity pattern?)
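As a minimal sketch of the power-iteration approach, assuming a hypothetical concrete $M$ with this sparsity pattern (entries of $0.5$ are made up for illustration; any row-stochastic values fitting the pattern would do):

```python
import numpy as np

# Hypothetical 8x8 example with the sparsity pattern above:
# each row has two nonzeros (0.5, 0.5), so M is row-stochastic.
M = np.zeros((8, 8))
for i, c in enumerate([0, 2, 4, 6, 0, 2, 4, 6]):
    M[i, c] = M[i, c + 1] = 0.5

x = np.full(8, 1 / 8)              # start from the uniform distribution
for _ in range(200):
    x_next = x @ M                 # one step of the chain (left multiplication)
    if np.allclose(x_next, x, atol=1e-12):
        break
    x = x_next

# x is now (approximately) the stationary vector: x M = x, sum(x) = 1.
```

Since each row sums to one, left multiplication preserves the sum of $x$, so no renormalization is needed inside the loop.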
EDIT: the matrix could be larger, of size $2^n \times 2^n$.
EDIT2: Note that $M$ is not full rank.
You want the null space. Since you already know that the eigenvalue is $1$, look at $M-I$. Use a QR factorization:
\begin{align} M-I &=QR \\ \Rightarrow Q^\top (M-I) &= R \end{align}
Since $M - I$ is singular, $R$ has at least one zero row, so the corresponding row $v$ of $Q^\top$ satisfies $$v (M - I) = \mathbf{0} \Rightarrow vM = v$$
If there is only one such row $v$, you have your answer. Otherwise, you have only narrowed the answer down to a subspace.
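A sketch of this QR route, again assuming a hypothetical $M$ with entries of $0.5$. Using SciPy's `qr` with column pivoting pushes the near-zero diagonal entry of $R$ to the last position, so the last row of $Q^\top$ serves as the left null vector:

```python
import numpy as np
from scipy.linalg import qr

# Hypothetical 8x8 example with the stated sparsity pattern:
# each row has two nonzeros (0.5, 0.5), so M is row-stochastic.
M = np.zeros((8, 8))
for i, c in enumerate([0, 2, 4, 6, 0, 2, 4, 6]):
    M[i, c] = M[i, c + 1] = 0.5

A = M - np.eye(8)

# QR with column pivoting: A[:, p] = Q @ R, i.e. Q.T @ A[:, p] = R.
# A is singular, so R's last diagonal entry (hence its last row) is ~0,
# and the last row of Q.T is a left null vector of A.
Q, R, p = qr(A, pivoting=True)
v = Q.T[-1]

x = v / v.sum()                    # normalize to a probability vector
```

Here `x` satisfies $xM = x$; the normalization assumes the eigenvalue $1$ is simple, so that `v` is proportional to the stationary vector.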
I suspect you wanted to avoid such a factorization because your matrix is sparse. You mentioned power iteration, and I had missed that such an iteration may be advantageous for your sparse matrix.
If you first reorder the rows of $M$ as follows, the sparsity of the matrix may help a QR algorithm that uses Givens rotations to exploit it. Find the row permutation $P$ that makes $PM$ block diagonal with $2 \times 2$ blocks; call the result $D_2$:
$$PM = D_2 = \begin{bmatrix} * & * & 0 & 0 & \cdots \\ * & * & 0 & 0 & \cdots \\ 0 & 0 & * & * & \cdots \\ 0 & 0 & * & * & \cdots \\ \vdots & \vdots & \vdots & \vdots & \ddots \end{bmatrix}$$
Again, look at $M - I$; in your example this would look like $$P(M-I) = D_2 - P = \begin{bmatrix} \ast - 1 &\ast &0 &0 &0 &0 &0 &0\\ \ast &\ast &0 &0 &-1 &0 &0 &0\\ 0 & -1 &\ast &\ast &0 &0 &0 &0 \\ 0 & 0 &\ast &\ast &0 &-1 &0 &0 \\ 0 &0 &-1 &0 &\ast &\ast &0 &0 \\ 0 &0 &0 &0 &\ast &\ast &-1 &0 \\ 0 &0 &0 &-1 &0 &0 &\ast &\ast \\ 0 &0 &0 &0 &0 &0 &\ast &\ast - 1 \end{bmatrix}$$ This ordering is close to the triangular form that the QR factorization produces: the permuted matrix is banded, with every nonzero within three diagonals of the main diagonal. This may speed up the computation, but I am not sure the memory usage could be greatly reduced. It does look like some of the sparsity would survive the Givens rotations, though.
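The reordering above can be sketched as follows, again assuming a hypothetical $M$ with entries of $0.5$; the interleaving permutation $(1,5,2,6,3,7,4,8)$ of the rows makes $PM$ block diagonal and $P(M-I)$ banded:

```python
import numpy as np

# Hypothetical concrete M with the stated sparsity (0.5's in each row).
M = np.zeros((8, 8))
for i, c in enumerate([0, 2, 4, 6, 0, 2, 4, 6]):
    M[i, c] = M[i, c + 1] = 0.5

# Interleave rows i and i+4 so that PM becomes block diagonal
# with 2x2 blocks (the D_2 above).
perm = [0, 4, 1, 5, 2, 6, 3, 7]
P = np.eye(8)[perm]

D2 = P @ M          # block diagonal
B = D2 - P          # = P(M - I)

# The reordered matrix is banded: every nonzero sits within
# three diagonals of the main diagonal.
i, j = np.nonzero(B)
assert np.max(np.abs(i - j)) <= 3
```

A Givens-based QR on a banded matrix only introduces fill-in within (roughly) the band, which is where the hoped-for savings would come from.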