I have an ill-conditioned matrix G representing Green's function estimates of a strain model for a rock mass that has been instrumented with optical fiber measuring strain. My problem is seemingly the classic: $$Gm = d$$ where $G$ is $m \times n$ with $m \ll n$ (typically $300 \times 10000$), $d$ is a noisy data vector of 300 measurements, and I'd like to solve for the model vector $m$ (apologies for the overloaded notation: $m$ is both the row count and the model).
I have tried vanilla least-squares, but the noise in the data, the ill-conditioning of $G$, and the underdetermined system all combine to give poor results for $m$. Recently, I've been regularizing by adding the penalty term $\lambda\|m\|_2^2$ to the least-squares objective function (ridge regression, a.k.a. Tikhonov regularization). This works better than vanilla least-squares, but I still don't have much confidence in the results.
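For reference, here is a simplified sketch of the ridge setup I'm describing, solved via the standard augmented least-squares system. The dimensions, random data, and the value of `lam` are stand-ins for illustration, not my real problem:

```python
import numpy as np

# Illustrative stand-ins: my real G is ~300 x 10000 with real data in d.
rng = np.random.default_rng(0)
n_data, n_model = 30, 100
G = rng.standard_normal((n_data, n_model))
d = rng.standard_normal(n_data)
lam = 1e-2  # regularization weight, currently chosen by hand

# Ridge / Tikhonov: minimize ||G m - d||^2 + lam * ||m||^2,
# solved as the stacked system [G; sqrt(lam) I] m = [d; 0].
A = np.vstack([G, np.sqrt(lam) * np.eye(n_model)])
b = np.concatenate([d, np.zeros(n_model)])
m_ridge, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The stacked form avoids explicitly forming $G^T G + \lambda I$, which matters when $G$ is already ill-conditioned.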
I have no reason to think LASSO or Elastic Net would work because I don't expect the model estimates to be sparse.
Are there other options? Should I try a different method of inverting $G$? Are there other minimization algorithms I can try? I've verified the Python least-squares routine against my own SVD-based inversion and get the same results (so at least my SVD routine is working!). Perhaps QR factorization?
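In case it's relevant, the SVD-based solve I used for verification looks roughly like the sketch below. It also shows truncating small singular values (TSVD), which I understand is another way to regularize; the truncation level `k`, dimensions, and random data here are illustrative assumptions, not my actual setup:

```python
import numpy as np

# Illustrative stand-ins for my real 300 x 10000 problem.
rng = np.random.default_rng(0)
G = rng.standard_normal((30, 100))
d = rng.standard_normal(30)

# Thin SVD of G; the minimum-norm least-squares solution is
# m = V diag(1/s) U^T d, summed over the retained singular values.
U, s, Vt = np.linalg.svd(G, full_matrices=False)

k = 20  # keep only the k largest singular values (TSVD regularization)
m_tsvd = Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])
```

Keeping all singular values reproduces the pseudoinverse solution; dropping the smallest ones suppresses the noise-amplifying directions, playing a role similar to $\lambda$ in ridge.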
How can I measure the quality of the solutions for more confidence?