I have been trying to solve the problem below using least squares regression in Python. However, my difficulty lies in adding the constraints. From what I can see of the least-squares routines in Python's SciPy package, I can only declare bound constraints on the individual unknowns (x), not general linear constraints. Linear programming allows inequality constraints, but I don't think this problem fits into linear programming, since the objective function is non-linear. Does anyone have advice on how I could go about solving this?
$$ y = \min_x \frac{1}{2}\|Dx - d\|_2^2 \quad \text{s.t.} \quad Ax \leq c $$
It strongly depends on how the problem is posed. If the feasible region $R = \{x : Ax \leq c\}$ is compact, you can apply the Weierstrass theorem: optimize in the interior using the differential of $y$, and then compare with the values of $y$ on the border of the region, as usual. Of course this can mean a lot of work, depending on $R$. If $R$ is not compact, I would check the growth of $y$ along the unbounded directions to get a sense of what's going on.
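A minimal sketch of that interior-versus-boundary check in NumPy (the matrices below are made-up illustrative data, not from the question): since the objective is convex, if the unconstrained least-squares minimizer, where the differential of $y$ vanishes, already satisfies $Ax \leq c$, it is the global constrained minimum; otherwise the minimum lies on the boundary.

```python
import numpy as np

# Illustrative data only -- the question does not specify D, d, A, c.
rng = np.random.default_rng(1)
D = rng.standard_normal((8, 2))
d = rng.standard_normal(8)
A = np.array([[1.0, 0.0]])
c = np.array([100.0])  # deliberately loose constraint for this example

# Interior critical point: the differential of y vanishes at the
# ordinary least-squares solution of D^T D x = D^T d.
x_unc, *_ = np.linalg.lstsq(D, d, rcond=None)

if np.all(A @ x_unc <= c):
    print("unconstrained minimizer is feasible -> it is the global minimum")
else:
    print("minimum lies on the boundary where Ax = c; optimize there")
```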
If $y$ were linear you would have the simplest case, since the optimum lies at a vertex (or on a segment joining two vertices), but unfortunately that is not your case, as you already noted in your remark.
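For a purely numerical route: since $\frac{1}{2}\|Dx-d\|_2^2$ is a convex quadratic and $Ax \leq c$ is linear, this is a quadratic program, and `scipy.optimize.minimize` with `method='trust-constr'` plus a `LinearConstraint` can handle it directly. A sketch with placeholder data (none of these matrices come from the question):

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Placeholder problem data for illustration.
rng = np.random.default_rng(0)
D = rng.standard_normal((10, 3))   # design matrix
d = rng.standard_normal(10)        # target vector
A = np.array([[1.0, 1.0, 1.0]])    # one inequality: x1 + x2 + x3 <= 1
c = np.array([1.0])

def objective(x):
    r = D @ x - d
    return 0.5 * r @ r             # (1/2) ||Dx - d||_2^2

def gradient(x):
    return D.T @ (D @ x - d)       # exact gradient of the quadratic

# Ax <= c expressed as -inf <= Ax <= c.
con = LinearConstraint(A, -np.inf, c)

res = minimize(objective, x0=np.zeros(3), jac=gradient,
               method="trust-constr", constraints=[con])
print(res.x)  # constrained minimizer
```

A dedicated QP or convex-optimization package (e.g. CVXPY) is another common choice for the same formulation.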
Cheers.