I'm interested in computing the minimum 2-norm least squares solution for a possibly rank-deficient (or numerically rank-deficient) large sparse matrix $A$ that's hidden behind a black-box routine $F$, where $F(v)$ computes $Av$ for a given vector $v$.
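For concreteness, here's roughly how I'm exposing the black box to solvers (a minimal sketch with a stand-in matrix; I'm also assuming I have, or can build, a companion routine for $A^{\mathsf T}w$, since CGNE/CGNR-type methods need it):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator

# Stand-in for the real problem: A is built explicitly here only so the
# example runs; a solver would only ever see the two product routines below.
rng = np.random.default_rng(0)
m, n, r = 200, 100, 60                                   # r < n => rank deficient
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

def F(v):                 # the black box: v -> A v
    return A @ v

def Ft(w):                # assumed companion routine: w -> A^T w
    return A.T @ w

A_op = LinearOperator((m, n), matvec=F, rmatvec=Ft, dtype=float)
```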
What are some efficient methods or resources I can look at for solving this?
Right now I'm only familiar with factorization-based approaches that require an explicit copy of the matrix, such as those in Golub and Van Loan's *Matrix Computations*: truncated SVD (TSVD), regularized SVD, and rank-revealing pivoted QR.
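For reference, this is the factorization-based baseline I have in mind (a minimal dense sketch; the helper name `tsvd_min_norm_lstsq` and the relative tolerance are my own placeholders):

```python
import numpy as np

def tsvd_min_norm_lstsq(A, b, tol=1e-10):
    """Minimum 2-norm least squares solution via truncated SVD: singular
    values below tol * sigma_max are treated as zero, so the result is the
    pseudoinverse solution even when A is (numerically) rank deficient."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = int(np.sum(s > tol * s[0]))                  # numerical rank
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])     # x = V_k diag(1/s_k) U_k^T b

# Quick check against the pseudoinverse on a rank-deficient example.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 30))    # rank 5
b = rng.standard_normal(50)
print(np.allclose(tsvd_min_norm_lstsq(A, b), np.linalg.pinv(A, rcond=1e-10) @ b))
```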
Is there a common way to modify CGNE or CGNR to tackle the rank-deficient case? What other iterative methods are there? Is it reasonable to form a randomized low-rank approximation (e.g. a randomized SVD or CUR) and proceed as with TSVD from there?
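One thing I've been considering is LSQR/LSMR applied through the black box; my understanding is that with the default $x_0 = 0$ the iterates stay in the range of $A^{\mathsf T}$, so in exact arithmetic they should converge to the minimum-norm least squares solution. A sketch of what I mean (again with a stand-in matrix in place of the black box):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsmr

# Stand-in rank-deficient problem; the solver only sees the products.
rng = np.random.default_rng(2)
m, n, r = 200, 100, 60
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))    # rank r < n
b = rng.standard_normal(m)

A_op = LinearOperator((m, n), matvec=lambda v: A @ v,
                      rmatvec=lambda w: A.T @ w, dtype=float)

# With the default x0 = 0 the iterates lie in range(A^T), so the computed
# solution should approximate the minimum-norm least squares solution.
x = lsmr(A_op, b, atol=1e-12, btol=1e-12)[0]

x_min_norm = np.linalg.pinv(A, rcond=1e-10) @ b
print(np.linalg.norm(x - x_min_norm) / np.linalg.norm(x_min_norm))   # small
```

But I don't have a good feel for how robust this is in floating point when the small singular values are only numerically zero, which is part of why I'm asking about regularization.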
Edit: On further thought, I think I could get Tikhonov regularization with CGNR by applying it to the regularized normal equations $(A^{\mathsf T}A + \Gamma^{\mathsf T}\Gamma)x = A^{\mathsf T}b$, i.e. replacing the inner product $\langle Ax, Ax \rangle$ with $\langle Ax, Ax \rangle + \langle \Gamma x, \Gamma x \rangle$ (where $\Gamma$ has full column rank, e.g. $\Gamma = \lambda I$). But I'm still curious about other methods!
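For concreteness, a minimal sketch of that idea (the helper name `cgnr_tikhonov` and the choice $\Gamma = \lambda I$ are just placeholders); if I'm not mistaken, for $\Gamma = \lambda I$ this is the same damped problem that LSQR/LSMR handle via their `damp` parameter:

```python
import numpy as np

def cgnr_tikhonov(F, Ft, G, Gt, b, n, tol=1e-10, maxiter=500):
    """CG on the regularized normal equations (A^T A + Gamma^T Gamma) x = A^T b,
    i.e. the minimizer of ||Ax - b||^2 + ||Gamma x||^2, using only the products
    F(v) = A v, Ft(w) = A^T w, G(v) = Gamma v, Gt(u) = Gamma^T u."""
    def apply_M(v):                          # v -> (A^T A + Gamma^T Gamma) v
        return Ft(F(v)) + Gt(G(v))
    x = np.zeros(n)
    r = Ft(b)                                # residual A^T b - M x for x = 0
    p = r.copy()
    rs = r @ r
    rhs_norm = np.linalg.norm(r)
    for _ in range(maxiter):
        Mp = apply_M(p)
        alpha = rs / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * rhs_norm:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Example with Gamma = lam * I on a stand-in rank-deficient matrix.
rng = np.random.default_rng(3)
m, n, rank, lam = 200, 100, 60, 1e-3
A = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))
b = rng.standard_normal(m)
x = cgnr_tikhonov(lambda v: A @ v, lambda w: A.T @ w,
                  lambda v: lam * v, lambda u: lam * u, b, n)
```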