"Batchwise" least squares with smoothness in row direction as extra objective


My math background is essentially non-existent, so please bear with me.

I have a "batchwise" (for lack of a better term) linear least squares problem $A X = Y$ that I solve as $\hat X = A^\dagger Y$, with the individual observations and fitted samples as column vectors of $Y$ and $\hat X$, respectively. Now I would like to add an extra smoothness objective in the row direction of $\hat X$ (i.e., a penalty proportional to the derivatives of the $\hat X^T_i$ row vectors). I suspect that this is no longer a plain least squares problem. But what is its formal definition? Can it be written as a QP problem, and what can be said about its convexity? The only sensible approach I can think of is solving it with a general non-linear optimizer, which works but is too expensive for the sample sizes I am working with.
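For concreteness, here is a sketch of one way this could be formalized (my assumptions, not necessarily what is wanted): penalize first differences along each row with a weight `lam`, giving the objective $\|A X - Y\|_F^2 + \lambda \|X D^T\|_F^2$ with $D$ a first-difference matrix. Setting the gradient to zero yields a Sylvester equation, which has a closed-form solver:

```python
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
m, n, k = 30, 5, 20              # observations, parameters, batch size (illustrative)
A = rng.standard_normal((m, n))
Y = rng.standard_normal((m, k))
lam = 1.0                        # smoothness weight (assumed; needs tuning)

# First-difference operator acting along the row direction of X
# (i.e., across the columns of X), shape (k-1, k).
D = np.diff(np.eye(k), axis=0)

# Objective: ||A X - Y||_F^2 + lam * ||X D^T||_F^2.
# Zero gradient gives the Sylvester equation
#   (A^T A) X + X (lam * D^T D) = A^T Y
X = solve_sylvester(A.T @ A, lam * (D.T @ D), A.T @ Y)

# Sanity check: the gradient at the solution should vanish.
grad = A.T @ (A @ X - Y) + lam * X @ (D.T @ D)
print(np.linalg.norm(grad))      # ~0 up to floating-point error
```

Since both terms of the objective are convex quadratics in the entries of $X$, this formulation is a convex QP; whether it matches the intended smoothness penalty is of course for you to judge.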

When googling, the most promising result was multi-objective matrix decomposition, but the corresponding papers are out of my depth.

Any pointers about how to best approach this would be highly appreciated.