Given a lower-dimensional vector $x$, a target vector $y$, and a constraint matrix $A$ s.t. $Ax = y$, I'd like to minimise $||Dx||^2$, where $D$ is a matrix that computes mixed forward and backward differences of $x$. For example, let $x$ be a 50-by-1 vector and $y$ a 100-by-1 vector. Then $A$ would be a 100-by-50 matrix containing some linearly dependent constraints.
How could one go about solving this optimisation problem? Writing the code is one thing, but I'm confused even about the principle. I suspect the simple pseudoinverse isn't enough. Is some sort of iterative approach needed?
Even simpler: how does one minimise $||x||^2$ subject to these constraints? Adding the $D$ afterwards should be straightforward.
I'm in a pickle here. This is an assignment, yes, but I was absent from a lecture and we haven't been provided with any additional material. Do these kinds of problems have a specific name I could search for to find more information?
Your problem falls into the class of quadratic programming; more specifically, it is an equality-constrained quadratic program.
For the equality-constrained case, introduce a Lagrange multiplier $\lambda$ for the constraint $Ax = y$ and set the gradient of the Lagrangian $||Dx||^2 + \lambda^T(Ax - y)$ to zero (the factor of 2 from differentiating the quadratic term is absorbed into $\lambda$). Minimising is then equivalent to solving the KKT system
$$\begin{bmatrix} D^TD & A^T \\ A & O\end{bmatrix}\begin{bmatrix} x \\ \lambda \end{bmatrix}=\begin{bmatrix} 0 \\ y\end{bmatrix}$$
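As a concrete sketch using the sizes from the question (with NumPy, a forward-difference $D$, and a randomly constructed rank-deficient $A$ — all assumptions for illustration): because the 100 constraints are linearly dependent, the KKT matrix is singular, so a least-squares solve is used rather than a plain inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 50, 100, 25  # x is n-by-1, y is m-by-1, A has rank r < n

# Build a consistent, redundant constraint set: A = B @ C has rank r,
# and y = A @ x_true guarantees Ax = y is satisfiable.
B = rng.standard_normal((m, r))
C = rng.standard_normal((r, n))
A = B @ C
x_true = rng.standard_normal(n)
y = A @ x_true

# Forward-difference matrix D, shape (n-1, n): (Dx)_i = x_{i+1} - x_i
D = np.diff(np.eye(n), axis=0)

# Assemble the KKT system
#   [ D^T D  A^T ] [ x ]   [ 0 ]
#   [ A      0   ] [ l ] = [ y ]
K = np.block([[D.T @ D, A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), y])

# The redundant constraints make K singular (lambda is not unique),
# but the system is consistent, so lstsq still recovers the optimal x.
sol, *_ = np.linalg.lstsq(K, rhs, rcond=None)
x = sol[:n]
```

The solution `x` satisfies the constraints and has a difference-norm no larger than that of any other feasible vector, including `x_true`. Note the minimiser of $||Dx||^2$ is unique here because the null space of $D$ (constant vectors) does not intersect the null space of $A$ for a generic $A$.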