I have a simple QCQP problem to solve:
$\min_{t} x(t)^{T}Ax(t)$ subject to the constraint $x(t)^{T}Ax(t) > 1$, where $A$ is a positive definite matrix and $x(t) \in \mathbb{R}^2$ is a time-parameterized curve.
While this is technically a convex QCQP, notice that the cost and the constraint are the same expression. This suggests that I can run gradient descent on the unconstrained cost, ignore the constraint entirely, and simply terminate the algorithm as soon as the cost drops to 1 or below!
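To make the idea concrete, here is a minimal sketch of that termination rule. The matrix `A`, the curve `x(t)`, the step size, and the iteration cap are all made-up illustrative choices, not part of the original problem:

```python
import numpy as np

# Hypothetical positive definite matrix (assumption for illustration).
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

def x(t):
    # Hypothetical time-parameterized curve; with this choice the
    # cost is 4 t^2, so it does cross the constraint boundary.
    return np.array([t, t])

def cost(t):
    v = x(t)
    return v @ A @ v

def grad(t, h=1e-6):
    # Central finite difference in the scalar parameter t.
    return (cost(t + h) - cost(t - h)) / (2.0 * h)

t, step = 2.0, 0.01
for _ in range(10_000):
    if cost(t) <= 1.0:
        # Cost has reached the constraint boundary: stop here
        # instead of descending further into the infeasible region.
        break
    t -= step * grad(t)

print(t, cost(t))
```

The loop is plain unconstrained gradient descent; the only concession to the constraint is the early-exit check, which is exactly the shortcut described above.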
But is this theoretically the right way to approach the problem? How can this be explained in terms of the Karush-Kuhn-Tucker (KKT) optimality conditions? Is there a specific name for a problem of this sort?