Let $\|\cdot\|$ denote the Euclidean norm. I am minimizing a convex, differentiable function $f : \mathbb{R}^d \rightarrow \mathbb{R}$ subject to a constraint. Applying the projected gradient method, I get:
\begin{equation} x_{k+1} = \arg\min_{y: \| y - x_1 \| \leq D} \{ \frac{1}{2} \| y - x_k + \eta \nabla f(x) \|^2\} \end{equation} In some slides I found online, the authors obtain, after simplification,
\begin{equation} x_{k+1} = \arg\min_{y: \| y - x_1 \| \leq D} \{ \frac{1}{2} \| y - x_k\|^2 + \eta \langle \nabla f(x), y - x_k \rangle \} \end{equation}
When I expand the square myself, I get:
\begin{align} x_{k+1} &= \arg\min_{y: \| y - x_1 \| \leq D} \{ \frac{1}{2} \| (y - x_k) + \eta \nabla f(x) \|^2\} \\ &= \arg\min_{y: \| y - x_1 \| \leq D} \{ \frac{1}{2} (\ \| y - x_k\|^2 + 2\eta \langle \nabla f(x), y - x_k \rangle + \eta^2 \langle \nabla f(x), \nabla f(x) \rangle \ ) \} \\ &= \arg\min_{y: \| y - x_1 \| \leq D} \{\ \frac{1}{2} \| y - x_k\|^2 + \eta \langle \nabla f(x), y - x_k \rangle + \frac{\eta^2}{2} \langle \nabla f(x), \nabla f(x) \rangle \ \} \end{align} As you can see, I end up with an extra third term, $\frac{\eta^2}{2} \langle \nabla f(x), \nabla f(x) \rangle$, that the authors do not have. Is my derivation correct? How do the authors arrive at their result?
Thank you!
Your derivation is correct. The minimization is over $y$, so the third term is a constant: it does not depend on $y$. And $\arg\min_y \{f(y) + \mathrm{const} \}= \arg\min_y f(y)$, since adding a constant shifts every value of the objective equally and therefore does not change where the minimum is attained.
(And it seems you made a typo: it should be $\nabla f(x_k)$ rather than $\nabla f(x)$.)
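To make the method concrete, here is a minimal numerical sketch of projected gradient descent on the ball $\{y : \|y - x_1\| \leq D\}$, assuming the Euclidean projection onto that ball (rescale toward the center whenever the point leaves the ball). The function names (`project_ball`, `projected_gradient`) and the quadratic test function are illustrative choices, not from the slides.

```python
import numpy as np

def project_ball(z, center, D):
    """Euclidean projection of z onto the ball {y : ||y - center|| <= D}."""
    diff = z - center
    norm = np.linalg.norm(diff)
    if norm <= D:
        return z
    # Outside the ball: rescale onto the boundary.
    return center + D * diff / norm

def projected_gradient(grad_f, x1, D, eta, num_iters):
    """Gradient step x - eta * grad_f(x), then project back onto the ball."""
    x = x1.copy()
    for _ in range(num_iters):
        x = project_ball(x - eta * grad_f(x), x1, D)
    return x

# Example: f(x) = ||x - c||^2 / 2, so grad_f(x) = x - c.
# The unconstrained minimizer c = [3, 0] lies outside the unit ball
# centered at x1 = 0, so the iterates converge to the boundary point [1, 0].
c = np.array([3.0, 0.0])
x1 = np.zeros(2)
x_star = projected_gradient(lambda x: x - c, x1, D=1.0, eta=0.1, num_iters=200)
```

Note that the update inside the loop is exactly your first formula: the projection of the unconstrained gradient step onto the constraint set, which (as shown above) equals the minimizer of the expanded quadratic, constant term included or not.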