I need to apply gradient projection, starting from the point $x = (2,0)$ with step size $\frac{1}{2}$, to the problem \begin{equation} \begin{aligned} \min \quad & \frac{1}{2}x_1^2 + \frac{1}{2}x_2^2 \\ \textrm{s.t.} \quad & 2 - x_1 - x_2 = 0 \\ \end{aligned} \end{equation}
Projected gradient descent is defined as $x_{k+1} = \Pi_X (x_k - \tau_k \nabla f(x_k))$, where $\Pi_X(x)$ is the orthogonal projection of $x$ onto $X$ and $\tau_k$ is the step size.
For the first iteration I take $x_k = (2, 0)^T$ as the starting point and $\tau_k = \frac{1}{2}$ as the step size. Since $\nabla f(x) = (x_1, x_2)^T$, the gradient step gives $x_k - \tau_k \nabla f(x_k) = (2,0)^T - \frac{1}{2}(2,0)^T = (1,0)^T$. This is the step where I am stuck: I don't know how to compute the orthogonal projection $\Pi_X\begin{pmatrix} 1 \\ 0 \end{pmatrix}$.
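For a numerical sanity check, here is a small sketch of one iteration, assuming the standard closed-form projection onto an affine hyperplane $\{x : a^T x = b\}$, namely $\Pi(z) = z - \frac{a^T z - b}{\|a\|^2}\, a$ (the function names `project` and `grad_f` are my own, not from any reference):

```python
import numpy as np

# Problem: min 1/2 x1^2 + 1/2 x2^2  s.t.  x1 + x2 = 2.
# The feasible set X is the hyperplane {x : a^T x = b} with a = (1,1), b = 2.
a = np.array([1.0, 1.0])  # constraint normal
b = 2.0

def project(z):
    # Orthogonal projection onto {x : a^T x = b}:
    # P(z) = z - (a^T z - b) / ||a||^2 * a
    return z - (a @ z - b) / (a @ a) * a

def grad_f(x):
    # Gradient of f(x) = 1/2 ||x||^2 is simply x.
    return x

x = np.array([2.0, 0.0])  # starting point
tau = 0.5                 # step size

x_next = project(x - tau * grad_f(x))
print(x_next)  # projects the gradient step (1, 0) back onto the line
```

The gradient step lands at $(1,0)^T$, off the constraint line, and the projection moves it perpendicularly back onto $x_1 + x_2 = 2$; running the loop for a few more iterations would show the iterates settling on the constrained minimizer.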