I have a data set: a matrix $X \in \Bbb R^{1300 \times 20}$ and an output vector $\mathbf{y} \in \Bbb R^{1300}$, $$\mathbf{y} = \begin{bmatrix} 100\\100\\\vdots\\100\end{bmatrix}.$$ I am trying to run OLS on this data with the constraint $\boldsymbol{\beta} \geq \mathbf{0}$ and additional inequality constraints.
I first attempted to solve the problem without the additional inequality constraints, using a Python function that performs coordinate descent. It converged, which confirmed that a global minimum can be found. However, that code accepts no constraints beyond $\boldsymbol{\beta} \geq \mathbf{0}$, so it is not sufficient for my purposes.
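For reference, the nonnegativity-only fit can be reproduced with SciPy's `nnls` (a stand-in for my coordinate-descent code; the `X` and `y` below are random placeholders, not my actual data):

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
X = rng.random((1300, 20))   # placeholder design matrix
y = np.full(1300, 100.0)     # placeholder target vector

# Nonnegative least squares: min ||X b - y||_2  subject to  b >= 0
beta, residual = nnls(X, y)
```

Because the objective is a convex quadratic and $\boldsymbol{\beta} \geq \mathbf{0}$ is the only constraint, this solver returns the global minimizer directly.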
I then tried another Python function that uses sequential quadratic programming (SQP), but it did not converge to the global minimum. It should produce the same result, since it is the same problem, yet it did not. You can see the details of my code here:
How to stop `fmin_slsqp` from converging to a local minimum?
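The SQP attempt looks roughly like the sketch below, using `scipy.optimize.minimize` with `method="SLSQP"`. The extra inequality constraint shown (a cap on $\sum_i \beta_i$) is a hypothetical example, and `X` and `y` are random placeholders, since my actual constraints and data are in the linked question:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.random((1300, 20))   # placeholder design matrix
y = np.full(1300, 100.0)     # placeholder target vector

def sse(b):
    """Sum of squared errors ||X b - y||^2."""
    r = X @ b - y
    return r @ r

def sse_grad(b):
    """Analytic gradient 2 X^T (X b - y), which helps SLSQP converge."""
    return 2.0 * (X.T @ (X @ b - y))

bounds = [(0.0, None)] * 20  # beta >= 0
# Hypothetical extra inequality constraint: sum(beta) <= 1000
cons = [{"type": "ineq", "fun": lambda b: 1000.0 - b.sum()}]

res = minimize(sse, x0=np.zeros(20), jac=sse_grad,
               method="SLSQP", bounds=bounds, constraints=cons)
```

Supplying the analytic Jacobian (`jac=sse_grad`) avoids finite-difference noise, which is one common source of poor SLSQP convergence on least-squares objectives.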
From this I concluded that the SQP method likely stopped at a local minimum, whereas the coordinate descent method did not.
Why would sequential quadratic programming fail to find the global minimum?