Monotonicity of projected gradient descent on a closed convex set


Let $C\subset\mathbb{R}^n$ be a closed convex set and $f:\mathbb{R}^n\to\mathbb{R}$ differentiable.

The update step of the projected gradient descent algorithm is $$x^{t+1}=P_C(x^t-\alpha\nabla f(x^t))$$ where $P_C(\cdot)$ is the projection operator on $C$.
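To make the update concrete, here is a minimal sketch of one projected gradient step in Python. The choice of $C$ as the Euclidean unit ball (and the helper names `project_ball`, `pgd_step`) are my own for illustration; any closed convex set with a computable projection works the same way.

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto C = {z : ||z|| <= radius}:
    # rescale x to the boundary if it lies outside the ball.
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def pgd_step(x, grad_f, alpha, project):
    # One update: x^{t+1} = P_C(x^t - alpha * grad f(x^t))
    return project(x - alpha * grad_f(x))
```

For example, minimizing $f(x)=\tfrac12\|x-b\|^2$ with $b=(2,0)$ over the unit ball, repeated calls to `pgd_step` drive the iterate to the boundary point $(1,0)$.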

I am trying to prove the monotonicity property, i.e. that for $\alpha$ small enough it holds that $$f(x^{t+1})\leq f(x^t)$$

What I have so far.

The projection operator has the properties $$(x-P_C(x))^T(z-P_C(x))\leq 0,\quad \forall z\in C$$ $$\|P_C(x)-P_C(y)\|\leq\|x-y\|$$ Applying the first property with $x=x^t-\alpha\nabla f(x^t)$ and $z=x^t$ (admissible since $x^t\in C$, being itself a projection): $$(x^t-\alpha\nabla f(x^t)-x^{t+1})^T(x^t-x^{t+1})\leq 0$$ hence $$\|x^t-x^{t+1}\|^2-\alpha\nabla f(x^t)^T(x^t-x^{t+1})\leq 0$$ and therefore $$\nabla f(x^t)^T(x^t-x^{t+1})\geq\frac{1}{\alpha}\|x^t-x^{t+1}\|^2$$ By differentiability of $f$, $$f(x^{t+1})-f(x^t)=\nabla f(x^t)^T(x^{t+1}-x^t)+o(\|x^{t+1}-x^t\|)$$ Moreover $o(\|x^{t+1}-x^t\|)=o(\alpha)$, because nonexpansiveness of the projection gives $$\|x^{t+1}-x^t\|=\|P_C(x^t-\alpha\nabla f(x^t))-P_C(x^t)\|\leq\alpha\|\nabla f(x^t)\|$$ Combining these, $$f(x^{t+1})-f(x^t)\leq-\frac{1}{\alpha}\|x^t-x^{t+1}\|^2+o(\alpha)=-\alpha\left(\frac{1}{\alpha^2}\|x^t-x^{t+1}\|^2-\frac{o(\alpha)}{\alpha}\right)$$ So I need to show that $\frac{1}{\alpha^2}\|x^t-x^{t+1}\|^2-\frac{o(\alpha)}{\alpha}\geq 0$ for $\alpha$ small enough.
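The key inequality above, $\nabla f(x^t)^T(x^t-x^{t+1})\geq\frac{1}{\alpha}\|x^t-x^{t+1}\|^2$, can be checked numerically. The sketch below is only an empirical sanity check, not a proof; the box constraint set, the quadratic $f$, and the matrices `A`, `b` are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_box(x, lo=-1.0, hi=1.0):
    # Projection onto the box [lo, hi]^n, a closed convex set.
    return np.clip(x, lo, hi)

# Test function: f(x) = 0.5 ||A x - b||^2, so grad f(x) = A^T (A x - b)
n = 5
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
grad = lambda x: A.T @ (A @ x - b)

alpha = 0.01
for _ in range(1000):
    # x^t must lie in C for z = x^t to be admissible in the projection property
    x = project_box(rng.standard_normal(n))
    x_next = project_box(x - alpha * grad(x))
    lhs = grad(x) @ (x - x_next)
    rhs = np.linalg.norm(x - x_next) ** 2 / alpha
    assert lhs >= rhs - 1e-9
```

The inequality holds exactly in exact arithmetic; the `1e-9` slack only absorbs floating-point roundoff.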

If I had a positive lower bound $c$ independent of $\alpha$, $$\frac{1}{\alpha^2}\|x^t-x^{t+1}\|^2\geq c,$$ or if I could show that $$\lim_{\alpha\to0}\frac{1}{\alpha^2}\|x^t-x^{t+1}\|^2=c>0,$$ then the proof would be complete. But I don't see how to reach either of these results. Alternatively, perhaps there is another way to prove (or disprove) monotonicity of the algorithm.
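As supporting evidence (not a proof), monotone decrease is easy to observe numerically when $\nabla f$ is $L$-Lipschitz and $\alpha\leq 1/L$. The example below is a sketch under my own choices: $f(x)=\tfrac12\|Ax-b\|^2$ (so $L=\|A^TA\|_2$) constrained to the unit ball.

```python
import numpy as np

rng = np.random.default_rng(1)

def project_ball(x, radius=1.0):
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

n = 4
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = lambda x: A.T @ (A @ x - b)

# grad f is L-Lipschitz with L = ||A^T A||_2 (spectral norm);
# step size alpha = 1/L is small enough for descent in this example.
L = np.linalg.norm(A.T @ A, 2)
alpha = 1.0 / L

diffs = []
x = project_ball(rng.standard_normal(n))
for _ in range(200):
    x_next = project_ball(x - alpha * grad(x))
    diffs.append(f(x_next) - f(x))  # should never be positive
    x = x_next
assert max(diffs) <= 1e-9
```

Combined with the inequality derived above, the descent lemma for $L$-smooth functions gives $f(x^{t+1})-f(x^t)\leq-\left(\frac{1}{\alpha}-\frac{L}{2}\right)\|x^{t+1}-x^t\|^2\leq 0$ for $\alpha\leq 2/L$, which is what this run illustrates.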