I'm looking for a way to compute a pseudo-inverse of a matrix (not the Moore-Penrose pseudo-inverse, but some other one) with the minimal number of non-zero entries (i.e., the maximum number of zero entries).
In MATLAB, the mldivide() operator does that, but I cannot find any documentation on how this operator evaluates my matrix so that it finds this particular pseudo-inverse. Is there a way to do this "by hand"?
Note: I know the Moore-Penrose pseudo-inverse minimizes the Euclidean norm, but I explicitly do not want to minimize that norm; I want to maximize the number of zero entries in my pseudo-inverse.
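For context, here is a small NumPy/SciPy sketch of the contrast I mean. For an underdetermined system, the Moore-Penrose solution `pinv(A) @ b` is generally dense, while a "basic" solution built from column-pivoted QR (which is, as far as I understand, similar in spirit to what mldivide does for rank-deficient/underdetermined systems) has at most rank(A) nonzeros. The variable names here are just for illustration:

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
m, n = 3, 6                      # underdetermined: fewer equations than unknowns
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Minimum-norm (Moore-Penrose) solution: generically has no zero entries.
x_pinv = np.linalg.pinv(A) @ b

# Basic solution via column-pivoted QR: A[:, piv] = Q @ R.
# Solve using only the first r pivot columns, set the rest to zero,
# giving at most r = rank(A) nonzero entries.
Q, R, piv = qr(A, pivoting=True)
r = np.linalg.matrix_rank(A)
x_basic = np.zeros(n)
x_basic[piv[:r]] = np.linalg.solve(R[:r, :r], (Q.T @ b)[:r])

print(np.count_nonzero(x_pinv))               # typically n = 6 nonzeros
print(np.count_nonzero(x_basic))              # at most m = 3 nonzeros
print(np.allclose(A @ x_basic, b))            # still an exact solution
```

This is only for a single right-hand side; what I am after is the same idea applied column by column to get a full pseudo-inverse with as many zeros as possible.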