Assume that we are going to minimize this objective function:
$$J_{min} = \frac{1}{2}x^TQx + c^Tx$$
Subject to: $$Ax \leq b_{lb} \\ x \geq 0 $$
The objective function has its origin in this equation:
$$Ax = b$$
Where $Q = A^TA$ and $c = -A^Tb$ for the objective function (note the minus sign, which comes from expanding the least-squares residual).
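Expanding the least-squares objective shows where these matrices come from (note the sign of $c$):

$$\frac{1}{2}\|Ax - b\|^2 = \frac{1}{2}x^TA^TAx - (A^Tb)^Tx + \frac{1}{2}b^Tb$$

So with $Q = A^TA$ and $c = -A^Tb$, minimizing $J_{min}$ is equivalent to minimizing $\frac{1}{2}\|Ax - b\|^2$, since the constant term $\frac{1}{2}b^Tb$ does not affect the minimizer.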
The gradient of $J_{min}$ with respect to $x$ is (using that $Q = A^TA$ is symmetric):
$$\nabla J_{min} = Qx + c$$
Now I want to minimize $J_{min}$ by using projected gradient descent.
$$ y = x_k - \alpha \nabla J_{min}(x_k)\\ x_{k+1} = \text{arg} \min_{x \in C} \|y-x\| $$
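As a minimal sketch of this iteration in NumPy: here I assume the feasible set $C$ is only the nonnegative orthant $x \geq 0$, whose Euclidean projection is simply clipping at zero (projecting onto the full polyhedron with $Ax \leq b_{lb}$ as well is harder and itself requires solving a QP). The function name, step size, and toy data are my own choices for illustration.

```python
import numpy as np

def projected_gradient_descent(A, b, alpha=1e-3, iters=5000):
    """Minimize 0.5 * ||Ax - b||^2 subject to x >= 0."""
    Q = A.T @ A           # Q = A^T A
    c = -A.T @ b          # c = -A^T b, from expanding the least-squares objective
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = Q @ x + c            # gradient of the quadratic
        y = x - alpha * grad        # unconstrained gradient step
        x = np.maximum(y, 0.0)      # Euclidean projection onto {x : x >= 0}
    return x

# Toy example: the unconstrained least-squares solution is [2, -3],
# so the projection forces the second component to 0.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
b = np.array([4.0, -3.0])
x = projected_gradient_descent(A, b)
print(x)  # approximately [2, 0]
```

The projection step here is cheap precisely because $C$ is a box-like set; this is what makes the general polyhedral case in the question interesting.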
Where $\alpha$ is a small step size. The interesting part is this equation:
$$x_{k+1} = \text{arg} \min_{x \in C} \|y-x\| $$ I want to find an $x$ that makes $\|y-x\|$ as close to $0$ as possible. Can I use the dual simplex method here?
$$\min\; c^Tx$$
Subject to: $$Ax \geq b_{lb} \\ x \geq 0 $$
What should $c^T$ be this time? Both $x$ and $y$ in $\|y-x\|$ are vectors. I can't just set $c^T = [1, -1]$, because $y$ acts as a constant here.