Constrained low-rank approximation


I'm trying to solve a problem similar to this problem. Instead of requiring the diagonal to be 0, I'd like to require the entries of each column of the low-rank approximation to decrease going down the rows.

I'm not exactly sure where to begin, especially as the referenced Stack Exchange post has no replies. I've read this paper, which seems relevant, but I've been unable to translate it to my example (which seems comparatively simple).

Edit with more detail:

I'd like to minimize $\|A-\hat A\|$, where $A$ is my original data matrix of full rank, $\hat A$ is a rank-$k$ matrix, and $\|\cdot\|$ is the Frobenius norm. I'd like to add the constraint that the entries of each column decrease as you go down the rows.

I've been able to achieve this with a truncated singular value decomposition followed by solving a constrained least-squares problem on every column, but I'd like to solve this as one problem. The constrained least-squares problem I've been using is $\min \|x-d\|$ such that $Cx \le 0$, where $d$ is a vector of data, $x$ is the modeled data, and $C$ is the forward-difference matrix

$$ C=\pmatrix{ -1 & 1 & & &\\ & -1 & 1 & &\\ & & \ddots & \ddots &\\ & & & -1 & 1}. $$
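The two-step heuristic described above can be sketched as follows, under the assumption that the column-wise problem $\min \|x-d\|$ subject to $Cx \le 0$ is exactly isotonic (decreasing) regression, which can be solved with the Pool Adjacent Violators algorithm; the function names here are hypothetical. Note that the column-wise projection can raise the rank above $k$, which is why this is not a joint solution.

```python
import numpy as np

def pava_increasing(y):
    """Least-squares nondecreasing fit to y via Pool Adjacent Violators."""
    blocks = []  # each block is [mean, weight]
    for v in np.asarray(y, dtype=float):
        blocks.append([v, 1.0])
        # Merge adjacent blocks while they violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    return np.concatenate([np.full(int(round(w)), m) for m, w in blocks])

def decreasing_fit(d):
    """min ||x - d||_2 subject to C x <= 0, i.e. entries decrease down x."""
    return -pava_increasing(-np.asarray(d, dtype=float))

def monotone_low_rank(A, k):
    """Two-step heuristic: rank-k truncated SVD, then project each column
    onto the decreasing cone.  The projection is done column by column, so
    the result satisfies the constraint but may not remain exactly rank k."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_k = U[:, :k] * s[:k] @ Vt[:k, :]          # rank-k truncated SVD
    return np.column_stack(
        [decreasing_fit(A_k[:, j]) for j in range(A_k.shape[1])]
    )
```

For example, `decreasing_fit([1, 3, 2])` pools the violating pair and returns `[2, 2, 2]`.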

1 Answer

Minimize $\sum_{ij} |A_{ij} - X_{ij}|$ subject to the constraints $X_{ij} \ge X_{i+1,j}$ using linear programming. This differs from minimizing the usual sum of squares, but may be adequate.
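Since the columns are independent under these constraints, the LP can be solved one column at a time. A minimal sketch with `scipy.optimize.linprog`, using the standard auxiliary variables $t_i \ge |a_i - x_i|$ to linearize the absolute values (note that this handles the monotonicity constraint only; a rank constraint is not expressible in an LP):

```python
import numpy as np
from scipy.optimize import linprog

def monotone_l1_fit_column(a):
    """L1 monotone (decreasing) fit of one column:
    minimize sum_i |a_i - x_i| subject to x_i >= x_{i+1}."""
    a = np.asarray(a, dtype=float)
    m = len(a)
    # Variables z = [x_1..x_m, t_1..t_m]; objective is sum(t).
    c = np.concatenate([np.zeros(m), np.ones(m)])
    I = np.eye(m)
    rows, rhs = [], []
    # t_i >= |x_i - a_i| becomes  x - t <= a  and  -x - t <= -a.
    rows.append(np.hstack([I, -I]));  rhs.append(a)
    rows.append(np.hstack([-I, -I])); rhs.append(-a)
    # -x_i + x_{i+1} <= 0 enforces decreasing entries.
    D = np.zeros((m - 1, 2 * m))
    for i in range(m - 1):
        D[i, i] = -1.0
        D[i, i + 1] = 1.0
    rows.append(D); rhs.append(np.zeros(m - 1))
    bounds = [(None, None)] * m + [(0, None)] * m   # x free, t >= 0
    res = linprog(c, A_ub=np.vstack(rows), b_ub=np.concatenate(rhs),
                  bounds=bounds, method="highs")
    return res.x[:m]
```

An already-decreasing column is returned unchanged with zero objective; otherwise the solver trades off deviations under the L1 cost.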

(There are programs and preprocessors that ease the task of expanding a few lines like the above into a large flat LP. Julia's JuMP can run various LP solvers (open-source, academic, and commercial); I gather these have very different, strongly problem-dependent runtimes. JuMP looks nice, but today (January 2019) is a work in progress. See also questions tagged julia-jump or linear-programming on Stack Overflow.)