Motivation
I am exploring the idea of constraining an approximation of an image, represented by an $m$-by-$n$ matrix $A$, to the values of a linearly spaced sequence $L$ of $mn$ elements generated over the unit interval $I = [0, 1)$.
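For concreteness, a minimal sketch of generating such an $L$ (NumPy assumed; the dimensions $m = 2$, $n = 3$ are example values of my own choosing):

```python
import numpy as np

m, n = 2, 3  # example dimensions, chosen for illustration

# mn linearly spaced values over [0, 1); endpoint=False excludes 1 itself
L = np.linspace(0.0, 1.0, num=m * n, endpoint=False)
```

With `endpoint=False`, the spacing between consecutive elements is $1/(mn)$ and the largest element stays strictly below $1$.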
Goal
I want to use each element of $L$ once and only once in filling an $m$-by-$n$ matrix $B$. I want to minimize the function
$$f = \sum_{i=1}^{m}\sum_{j=1}^{n}{|A_{ij}-B_{ij}|}$$
so that each $B_{ij}$ is constrained both by the minimization of $f$ and by the bijection between the entries of $B$ and the elements of $L$ induced by using each element exactly once.
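Evaluating $f$ for a candidate $B$ is just an entrywise sum of absolute differences; a sketch (NumPy assumed, with small example matrices of my own):

```python
import numpy as np

# example matrices, values chosen only for illustration
A = np.array([[0.1, 0.5], [0.9, 0.3]])
B = np.array([[0.0, 0.5], [0.75, 0.25]])

# f = sum over all entries of |A_ij - B_ij|
f = np.abs(A - B).sum()
```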
Naive Approach
Evaluate the error $f$ for all $(mn)!$ possible fillings of $B$ and keep the one with the smallest error.
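The brute force above can be sketched as follows (NumPy assumed; the $2$-by-$2$ example matrix is my own, kept tiny because the search visits all $(mn)!$ permutations):

```python
import itertools

import numpy as np

m, n = 2, 2  # kept tiny: the search enumerates (mn)! = 24 permutations
A = np.array([[0.2, 0.8], [0.4, 0.6]])  # example image, chosen for illustration
L = np.linspace(0.0, 1.0, num=m * n, endpoint=False)  # [0, 0.25, 0.5, 0.75]

best_f, best_B = np.inf, None
for perm in itertools.permutations(L):  # every ordering uses each element once
    B = np.array(perm).reshape(m, n)    # fill B row by row
    f = np.abs(A - B).sum()             # objective from the Goal section
    if f < best_f:
        best_f, best_B = f, B
```

After the loop, `best_B` holds a minimizer of $f$ and `best_f` its error; the factorial growth is exactly why this is only viable for toy sizes.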
Question
Computationally speaking, what is the best algorithm for finding $B$, and what is its computational complexity? How hard is the problem to solve approximately? Is there a greedy algorithm for it?