Consider the following optimization problem.
$$\begin{array}{ll} \text{minimize} & (x^2-1)^2+y^2\\ \text{subject to} & x^2 - 4 \le 0\\ & x + y \le 0\end{array}$$
Using KKT conditions, find the optimal solution.
Solution: If one draws the feasible region and the level sets of the objective, one clearly sees that $\overline x=(-1,0)$ is the optimal solution (it is feasible and attains the objective value $0$).
The rest is just calculation: we can verify algebraically that $\overline x$ satisfies the KKT conditions, and hence that it is the optimal solution.
Question:
Suppose we are not lucky enough to be able to draw the region and the objective function, so we cannot find the solution geometrically.
What should we do in such cases?
If you check the KKT theorem (Confusion about definition of KKT conditions), it does not explicitly say how to find the point that will be the optimal solution; it just states that if a point $x$ is feasible, $f$ is pseudoconvex at $x$, and there exist scalars such that ..., then $x$ is optimal.

Although the question clearly asks one to use the Karush-Kuhn-Tucker (KKT) conditions, one can do without them. Note that the feasible region can be parameterized as follows
$$\begin{bmatrix} x\\y\end{bmatrix} = u \begin{bmatrix} 2\\-2\end{bmatrix} + v \begin{bmatrix} 0\\-1\end{bmatrix}, \qquad u \in [-1,1], v \geq 0$$
(Indeed, $x = 2u \in [-2,2]$, so $x^2 - 4 \le 0$, and $x + y = 2u - 2u - v = -v \le 0$, so both constraints are satisfied automatically.) Since $u \in [-1,1]$, substitute $u = \sin (\theta)$; since $v \geq 0$, substitute $v = t^2$. This makes $\theta$ and $t$ unconstrained, so we can use SymPy to perform the substitutions and find where the gradient vanishes. Doing so produces a set of $8$ candidate extremizers.
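A SymPy script along these lines might look as follows. This is a sketch of my own, not the original script: it converts the trigonometric stationarity system into a polynomial one via $s=\sin\theta$, $c=\cos\theta$ and solves exactly, so the near-duplicate numerical pair mentioned below collapses and only the $7$ distinct candidates appear.

```python
import sympy as sp

theta, t = sp.symbols('theta t', real=True)

# Parameterization of the feasible region:
# x = 2*sin(theta) covers [-2, 2], so x**2 - 4 <= 0 holds automatically,
# and y = -2*sin(theta) - t**2 gives x + y = -t**2 <= 0 automatically.
x = 2*sp.sin(theta)
y = -2*sp.sin(theta) - t**2
f = (x**2 - 1)**2 + y**2

# Stationarity with respect to the now-unconstrained parameters theta and t
g_theta, g_t = sp.diff(f, theta), sp.diff(f, t)

# Turn the trigonometric system into a polynomial one via s = sin(theta),
# c = cos(theta), together with the Pythagorean identity s**2 + c**2 = 1.
s, c = sp.symbols('s c', real=True)
eqs = [g.subs({sp.sin(theta): s, sp.cos(theta): c}) for g in (g_theta, g_t)]
eqs.append(c**2 + s**2 - 1)

sols = sp.solve(eqs, [s, c, t], dict=True)

# Keep the real solutions, map them back to (x, y), and deduplicate.
candidates = {(2*sol[s], sp.simplify(-2*sol[s] - sol[t]**2))
              for sol in sols if all(v.is_real for v in sol.values())}

# Print the candidates sorted by objective value.
for px, py in sorted(candidates, key=lambda p: (p[0]**2 - 1)**2 + p[1]**2):
    print((px, py), "objective =", (px**2 - 1)**2 + py**2)
```

The exact solve returns the boundary candidates $(\pm 2, \mp 2)$, $(-2,0)$, the interior stationary points $(0,0)$ and $(\pm\tfrac{\sqrt 2}{2}, \mp\tfrac{\sqrt 2}{2})$, and the minimizer $(-1,0)$.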
Note that
$(-1.0,\ 0)$ and $(-1.0,\ 0.\mathrm{e}{-125})$ are actually the same point. Thus, we have $7$ candidates. Borrowing the pretty plot in Cesareo's answer, these $7$ candidates are plotted below. Plotting the objective over the feasible region, we conclude that $(-1,0)$ is the global minimizer.
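For completeness, here is a direct check of the KKT conditions at $(-1,0)$, which the question asked for. The gradient of the objective is
$$\nabla f(x,y) = \begin{bmatrix} 4x(x^2-1)\\ 2y \end{bmatrix}, \qquad \nabla f(-1,0) = \begin{bmatrix} 0\\ 0 \end{bmatrix},$$
and both constraints are inactive at $(-1,0)$ since $(-1)^2 - 4 = -3 < 0$ and $-1 + 0 = -1 < 0$. So stationarity holds with multipliers $\mu_1 = \mu_2 = 0$, and complementary slackness is trivial. Note, however, that $f$ is not pseudoconvex on the feasible set (along $y=0$ it has a local maximum at $x=0$), so the sufficiency form of the KKT theorem does not apply here; global optimality of $(-1,0)$ instead follows from the candidate search above, or simply from $f \ge 0$ everywhere with $f(-1,0)=0$.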