Define $\beta=(\beta_0,...,\beta_n)^T$.
I have come upon a problem that I'm not sure how to solve. Say we have a linear model and I want to find the best-fit line. I want to minimize the RSS, $\min_\beta\,(y-X\beta)^T(y-X\beta)$, where $X$ is the design matrix and $\beta$ is the vector of coefficients. For a full-rank $X$ the solution to this problem is given uniquely by \begin{align*} \hat{\beta}=(X^TX)^{-1}X^Ty. \end{align*} I can verify this solution by setting the gradient of the RSS with respect to $\beta$ to zero and solving for $\beta$, but...
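For reference, the unconstrained derivation alluded to above, written out step by step:

```latex
\begin{align*}
\nabla_\beta \, \mathrm{RSS}
  &= \nabla_\beta \,(y-X\beta)^T(y-X\beta)
   = -2X^T(y-X\beta) = 0 \\
\Rightarrow\quad X^TX\hat{\beta} &= X^Ty
\quad\Rightarrow\quad \hat{\beta} = (X^TX)^{-1}X^Ty,
\end{align*}
```

where the last step uses the invertibility of $X^TX$, which holds exactly when $X$ has full column rank.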
How do I find the solution if I apply one or more constraints such as $\beta_j >0$?
You want to minimize a quadratic objective subject to constraints. If your constraints are linear, this is quadratic programming. (Note that a strict inequality like $\beta_j > 0$ defines an open feasible set, so in practice solvers work with the non-strict version $\beta_j \ge 0$.)
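A minimal sketch of the non-negativity-constrained case using SciPy's non-negative least squares solver; the data `X`, `y`, and `beta_true` here are made up purely for illustration:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic example: design matrix X, positive true coefficients, noisy response
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, 0.5, 2.0])
y = X @ beta_true + 0.01 * rng.normal(size=50)

# Unconstrained OLS solution (in general its entries can be negative)
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Constrained solution: min ||y - X beta||_2  subject to  beta >= 0
beta_nnls, resid_norm = nnls(X, y)
print(beta_nnls)  # all entries are guaranteed non-negative
```

For general box constraints, `scipy.optimize.lsq_linear` accepts a `bounds` argument; for arbitrary linear constraints you would hand the problem to a quadratic programming solver.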