I was having a go at implementing the algorithm from the paper Spectral Matting, which looks neat.
In it they construct a matting Laplacian. This is a sparse, symmetric N×N matrix, where N is the number of pixels in the image and each row/column corresponds to one pixel. The only non-zero entries are those linking adjacent pixels, so the matrix is block tridiagonal, with each of the blocks itself tridiagonal.
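For concreteness, here's a sketch of that sparsity pattern in SciPy. This is not the actual matting Laplacian (the real one uses colour-based affinities), just a uniform 4-neighbour grid Laplacian as a stand-in with the same block tridiagonal shape:

```python
import numpy as np
import scipy.sparse as sp

def path_laplacian(n):
    # Laplacian of a path graph on n vertices: tridiagonal,
    # degree on the diagonal, -1 for each neighbour.
    d = np.full(n, 2.0)
    d[0] = d[-1] = 1.0
    off = -np.ones(n - 1)
    return sp.diags([off, d, off], [-1, 0, 1])

def uniform_laplacian(h, w):
    # Kronecker sum of the two 1-D path Laplacians gives the 2-D grid
    # Laplacian on an h-by-w image: (h*w) x (h*w), with tridiagonal
    # h-by-h diagonal blocks and identity off-diagonal blocks.
    return sp.kronsum(path_laplacian(h), path_laplacian(w)).tocsr()

L = uniform_laplacian(4, 3)   # 12x12, symmetric, block tridiagonal
```

Row sums are zero and the matrix is positive semi-definite, so the smallest eigenvalue is exactly 0 — just like the matting Laplacian.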

The algorithm depends on finding a few of the smallest eigenvalues (and their eigenvectors) of this matrix. As you might guess, for realistic images the matrix becomes huge, and even using ARPACK (which is what they do) it becomes prohibitively slow. I'd also like to avoid depending on archaic Fortran software if possible!
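For reference, here's a minimal SciPy sketch of the ARPACK approach, again with a plain grid Laplacian standing in for the real matting Laplacian. `eigsh` wraps ARPACK, and shift-invert mode makes the eigenvalues nearest `sigma` converge fastest:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def path_laplacian(n):
    d = np.full(n, 2.0)
    d[0] = d[-1] = 1.0
    off = -np.ones(n - 1)
    return sp.diags([off, d, off], [-1, 0, 1])

# Hypothetical stand-in: 30x40 "image", so a 1200x1200 sparse matrix.
h, w = 30, 40
L = sp.kronsum(path_laplacian(h), path_laplacian(w)).tocsc()

# Shift-invert mode: ARPACK iterates on (L - sigma*I)^{-1}, so the
# eigenvalues of L nearest sigma converge first. A small negative sigma
# keeps the shifted matrix non-singular (L itself is exactly singular)
# while still targeting the bottom of the spectrum.
vals, vecs = eigsh(L, k=4, sigma=-1e-3, which='LM')
# vals are the 4 smallest eigenvalues of L (one of them ~0).
```

The dominant cost is the sparse factorisation of `L - sigma*I`, which is where the block tridiagonal structure could in principle be exploited.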
I'm wondering if there is a faster way to find the smallest eigenvalues/vectors.
So far I've found that linear systems with this matrix can be solved quickly using the block Thomas algorithm (though the inverse itself would be dense, so you wouldn't form it explicitly). And MRRR can be used to find the eigendecomposition of an ordinary tridiagonal matrix. Is there a block MRRR method? How does it scale?
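As far as I can tell, MRRR is what LAPACK's `stemr` driver implements, and SciPy exposes it through `eigh_tridiagonal` — but only for scalar, not block, tridiagonal matrices. A sketch, assuming the `'stemr'` driver accepts index-based subset selection:

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

# MRRR on a scalar tridiagonal matrix via LAPACK's ?stemr.
# This does NOT handle the block tridiagonal case directly; it just
# shows extracting only the k smallest eigenpairs, which MRRR can do
# in roughly O(n*k) time.
n = 2000
d = 2.0 * np.ones(n)        # main diagonal
e = -1.0 * np.ones(n - 1)   # sub/super-diagonal

# select='i' with select_range=(0, 4) asks for eigenpairs with
# (0-based, inclusive) indices 0..4, i.e. the 5 smallest.
vals, vecs = eigh_tridiagonal(d, e, select='i', select_range=(0, 4),
                              lapack_driver='stemr')
```

For this particular tridiagonal Toeplitz matrix the eigenvalues are known in closed form, 2 − 2cos(kπ/(n+1)), which makes it easy to sanity-check the call.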
I'd appreciate any help.