Convergence of gradient descent method with non-Lipschitz gradient


I would like to study the convergence of the gradient descent method applied to the function

$$f(x)=|x-1|^3$$

In order to do that, I was thinking about using the following theorem:

Assume that $f : \mathbb{R}^n \to \mathbb{R}$ is convex and differentiable. Suppose the gradient of $f$ is Lipschitz continuous with constant $L>0$, meaning

$$\| \nabla f(x) - \nabla f(y) \| \leq L \| x-y \| $$

for any $x, y \in \mathbb{R}^n$. Then gradient descent with fixed step size $t \leq \frac{1}{L}$ satisfies

$$f(x^{(k)})-f(x^{*}) \leq \frac{\|x^{(0)}-x^{*}\|^2}{2 t k}$$

that is, gradient descent has convergence rate $O\!\left(\frac{1}{k}\right)$.
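As a sanity check on the theorem (not part of the question itself), here is a minimal numerical sketch on a function that *does* have a Lipschitz gradient, $f(x) = x^2/2$ with $L = 1$ and minimizer $x^* = 0$, verifying that the iterates respect the bound $f(x^{(k)}) - f(x^*) \leq \frac{\|x^{(0)} - x^*\|^2}{2tk}$:

```python
# Illustration: verify the O(1/k) bound for f(x) = x^2 / 2,
# whose gradient f'(x) = x is Lipschitz with constant L = 1.

def grad(x):
    return x

x0, L = 5.0, 1.0
t = 0.1          # any fixed step t <= 1/L is allowed by the theorem
x = x0
for k in range(1, 101):
    x = x - t * grad(x)
    f_gap = x * x / 2              # f(x_k) - f(x*), since f(x*) = 0
    bound = x0 * x0 / (2 * t * k)  # ||x0 - x*||^2 / (2 t k)
    assert f_gap <= bound
print("O(1/k) bound holds for k = 1..100")
```

The bound is loose here (the quadratic actually converges linearly), but it illustrates what the theorem guarantees.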

My problem is that the gradient of the function I am studying is not Lipschitz on all of $\mathbb{R}$: indeed $f'(x) = 3(x-1)\,|x-1|$, so $|f''(x)| = 6|x-1|$ is unbounded, and the gradient is Lipschitz only if we restrict ourselves to a bounded interval, say $(1, a)$.
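One common workaround, sketched below under the assumption that the iterates stay in a bounded interval: since $|f''(x)| = 6|x-1|$, the gradient is Lipschitz on $[1, x^{(0)}]$ with local constant $L = 6|x^{(0)} - 1|$, and with step $t = 1/L$ the iterates are monotone toward $x^* = 1$, so they never leave that interval. This is an illustration of the idea, not a full proof:

```python
# Gradient descent on f(x) = |x - 1|^3, using a *local* Lipschitz
# constant: on [1, x0] we have |f''(x)| = 6 |x - 1| <= 6 |x0 - 1| = L,
# and with t = 1/L the iterates stay in [1, x0] (assumption sketched
# above), so the theorem applies on that interval.

def f(x):
    return abs(x - 1) ** 3

def grad_f(x):
    return 3 * (x - 1) * abs(x - 1)

def gradient_descent(x0, n_steps):
    L = 6 * abs(x0 - 1)   # local Lipschitz constant of f' on [1, x0]
    t = 1 / L
    x = x0
    for _ in range(n_steps):
        x = x - t * grad_f(x)
    return x

x = gradient_descent(2.0, 1000)
print(x, f(x))  # x approaches the minimizer x* = 1
```

Note the convergence is slow: near $x^* = 1$ the gradient $3(x-1)|x-1|$ vanishes quadratically, so the error shrinks roughly like $1/k$ rather than geometrically.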

Could you help me? Any insight is very much appreciated :)