Let $f:\mathbb{R}^n \longrightarrow \mathbb{R}$ be a convex and differentiable function and let $S$ be a nonempty, closed, and convex subset of $\mathbb{R}^n$.
The function $g:\mathbb{R}^n \longrightarrow \mathbb{R}^n$ is defined by: $$ g(x) := -\big[ \nabla f\big(P_S(x)\big) - P_S(x)\big], \;\; \forall x \in \mathbb{R}^n \tag{1} $$ where $P_S(x)$ is the projection of $x$ onto the set $S$.
Prove that if $x$ is a fixed point for $g$ then its projection to $S$ minimizes $f$ over $S$.
ATTEMPT
Since $x$ is a fixed point of $g$, we have $g(x)=x$.
Rearranging $(1)$ gives:
$$
\nabla f\big(P_S(x)\big) = P_S(x) - x \tag{2}
$$
Case 1: $(x \in S)$
We have that $P_S(x) = x$, and thus $\nabla f(x)=x-x=0$.
Since $f$ is convex, a point with vanishing gradient is a global minimizer, so $x \in \operatorname{argmin}_{s \in S} f(s)$, as desired.
Case 2: $(x \notin S)$
Taking norms in $(2)$ gives:
$$
\big\Vert \nabla f\big(P_S(x)\big)\big\Vert = \Vert P_S(x) - x \Vert = d_S(x) := \inf_{s\in S} \Vert x-s\Vert \tag{3}
$$
where $d_S$ is the distance function to $S$.
I am frankly stuck and unsure how to proceed. Moreover, it is not clear to me how to use the closedness and convexity of $S$.
Any help is greatly appreciated.
I will focus on case $2$.
We will use the following result:
Suppose $S \subseteq \mathbb{R}^n$ is nonempty and convex, and let $h$ be a convex differentiable function. Then $x^* \in S$ is a global minimizer of $h$ over $S$ if and only if $$\nabla h(x^*)^T(y-x^*) \ge 0, \quad \forall y \in S.$$
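To make the condition concrete, here is a small numerical sketch (purely illustrative; the one-dimensional example $h(t)=t^2$ on $S=[1,2]$ is assumed, not part of the problem):

```python
import numpy as np

# Hypothetical check of the first-order condition: h(t) = t^2 on S = [1, 2].
# The minimizer is t* = 1: h'(1) = 2 and (y - 1) >= 0 for every y in S,
# so h'(t*) * (y - t*) >= 0 holds; an interior candidate like t = 1.5 fails it.
def hprime(t):
    return 2.0 * t

ys = np.linspace(1.0, 2.0, 101)           # sample points of S
assert np.all(hprime(1.0) * (ys - 1.0) >= 0)      # t* = 1 satisfies the condition
assert not np.all(hprime(1.5) * (ys - 1.5) >= 0)  # t = 1.5 does not
```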
Closedness of $S$ ensures that the projection $P_S(x)$ exists and lies in $S$; convexity of $S$ ensures it is unique.
We first establish a property of $P_S(x)$: it is the optimal solution to $\min_{z \in S}\|z-x\|^2$, so applying the result above to $h(z)=\|z-x\|^2$ (with $\nabla h(z) = 2(z-x)$) gives $$2(P_S(x)-x)^T(y-P_S(x)) \ge 0, \quad \forall y \in S.$$
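This variational inequality is easy to check numerically. A sketch with an assumed box $S=[0,1]^5$, where the projection is coordinatewise clipping (the box and dimension are illustrative choices, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

def proj_box(x, lo=0.0, hi=1.0):
    # Projection onto the box [lo, hi]^n is coordinatewise clipping.
    return np.clip(x, lo, hi)

x = 3.0 * rng.normal(size=5)        # an arbitrary point, possibly outside S
p = proj_box(x)                     # P_S(x)

# Check (P_S(x) - x)^T (y - P_S(x)) >= 0 for sampled y in S.
for _ in range(1000):
    y = rng.uniform(0.0, 1.0, size=5)   # a point of S = [0, 1]^5
    assert (p - x) @ (y - p) >= -1e-12
```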
Let's now check the optimality condition for $f$ on $S$. Using $(2)$ together with the inequality above, we have, $\forall y \in S$,\begin{align} \nabla f(P_S(x))^T(y-P_S(x))=(P_S(x)-x)^T(y-P_S(x)) \ge 0. \end{align}
Hence $P_S(x)$ minimizes $f$ over $S$.
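Putting the whole statement together, here is a numerical sanity check (not part of the proof; the quadratic $f$ and box $S$ below are assumed purely for illustration):

```python
import numpy as np

# Hypothetical instance: f(x) = 0.5 * ||x - c||^2 (convex, differentiable),
# S = [0, 1]^2 (nonempty, closed, convex), with c chosen outside S.
c = np.array([2.0, -0.5])

def grad_f(x):
    return x - c

def proj(x):
    return np.clip(x, 0.0, 1.0)    # projection onto the box S

def g(x):
    # g(x) = -( grad f(P_S(x)) - P_S(x) ), as in (1)
    p = proj(x)
    return -(grad_f(p) - p)

# Find a fixed point by iterating g (for this particular f, g is constant,
# so the iteration settles immediately).
x = np.zeros(2)
for _ in range(50):
    x = g(x)
assert np.allclose(g(x), x)        # x is a fixed point of g

# Its projection should minimize f over S: compare against a grid search on S.
p = proj(x)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 101),
                            np.linspace(0, 1, 101)), -1).reshape(-1, 2)
best = grid[np.argmin(0.5 * np.sum((grid - c) ** 2, axis=1))]
assert np.allclose(p, best, atol=1e-2)
```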