Relation between a convex objective function's value and the distance to the optimum.


I am trying to prove that the $L_2$ distance between the current solution and the optimum decreases whenever the objective function decreases. That is: $$ f(x^{(t+1)}) \leq f(x^{(t)}) \Rightarrow \|x^{(t+1)}-x^*\|_2 \leq \|x^{(t)} - x^*\|_2 $$ where $f$ is convex and the gradient of $f$ is $L$-Lipschitz continuous. It looks intuitively clear, but I can't prove it. Could you give me a quick insight into it? Thanks!



Best answer

It's untrue. Consider a function like $f(x_{1},x_{2})=10^{7}x_{1}^{2}+x_{2}^{2}$, whose minimizer is $x^* = (0,0)$. Its level sets are extremely elongated ellipses, so points with equal objective value can be at wildly different distances from $x^*$. For instance, $x = (10^{-3}, 0)$ gives $f(x) = 10$ and $\|x - x^*\|_2 = 10^{-3}$, while $y = (0, 2)$ gives the smaller value $f(y) = 4$ but the much larger distance $\|y - x^*\|_2 = 2$. A decrease in the objective therefore does not force a decrease in the distance to the optimum.
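A minimal numerical check of this counterexample (the specific test points $(10^{-3}, 0)$ and $(0, 2)$ are one illustrative choice, not unique):

```python
# Counterexample check for f(x1, x2) = 1e7 * x1**2 + x2**2,
# which is convex with an L-Lipschitz gradient (L = 2e7) and minimizer x* = (0, 0).

def f(x1, x2):
    return 1e7 * x1**2 + x2**2

def dist_to_optimum(x1, x2):
    # Euclidean distance to x* = (0, 0)
    return (x1**2 + x2**2) ** 0.5

x = (1e-3, 0.0)  # on the steep axis: f(x) = 10, very close to x*
y = (0.0, 2.0)   # on the flat axis:  f(y) = 4,  far from x*

# The objective decreases from x to y ...
assert f(*y) < f(*x)
# ... yet the distance to the optimum increases.
assert dist_to_optimum(*y) > dist_to_optimum(*x)

print(f(*x), f(*y))                            # ~10.0  4.0
print(dist_to_optimum(*x), dist_to_optimum(*y))  # 0.001  2.0
```

The badly conditioned Hessian ($10^7$ vs. $1$ on the diagonal) is exactly what makes objective value and distance to the minimizer decouple.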