Fixed point and gradient descent


Suppose $f:[0,1]^d \to [0,1]^d$ is a smooth function. Is there any theorem comparing the convergence of the sequences obtained by (1) repeatedly applying (iterating) $f$ and (2) performing gradient descent on $\mathbf{x} \mapsto \|\mathbf{x}-f(\mathbf{x})\|^2$ starting from an arbitrary point $\mathbf{x}_0$ ?


In my humble opinion, the two methods are not directly comparable. Fixed-point iteration is guaranteed to converge (by the Banach fixed-point theorem) only when $f$ is a contraction, i.e., $k$-Lipschitz with $k < 1$; in that case the iterates converge to the unique fixed point at the linear rate $k^n$. Gradient descent, on the other hand, is applied to $g(\mathbf{x}) = \|\mathbf{x}-f(\mathbf{x})\|^2$, which need not be convex, so we can only expect convergence to a critical point of $g$. A local minimum with $g(\mathbf{x}) > 0$ is not a fixed point of $f$, so gradient descent may fail to find a fixed point even when one exists.
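To illustrate, here is a small numerical sketch comparing the two sequences on a hypothetical contraction $f(\mathbf{x}) = 0.5 + 0.25\cos(\mathbf{x})$ (applied componentwise, $0.25$-Lipschitz, mapping $[0,1]^2$ into itself); the choice of $f$, step size, and iteration counts are my own assumptions, not from the question:

```python
import numpy as np

def f(x):
    # Hypothetical contraction on [0,1]^2: componentwise, 0.25-Lipschitz
    return 0.5 + 0.25 * np.cos(x)

def g(x):
    # Objective for gradient descent: g(x) = ||x - f(x)||^2
    return np.sum((x - f(x)) ** 2)

def grad_g(x, eps=1e-6):
    # Numerical gradient of g via central differences
    grad = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        grad[i] = (g(x + e) - g(x - e)) / (2 * eps)
    return grad

x0 = np.array([0.9, 0.1])

# (1) Fixed-point iteration: converges linearly at rate k = 0.25
x_fp = x0.copy()
for _ in range(100):
    x_fp = f(x_fp)

# (2) Gradient descent on g with a fixed step size (an assumption)
x_gd = x0.copy()
for _ in range(5000):
    x_gd = x_gd - 0.5 * grad_g(x_gd)

print(x_fp, x_gd)
```

Since this particular $f$ is a contraction, both sequences reach the same unique fixed point; the interesting failure mode arises for non-contractive $f$, where gradient descent can stall at a spurious local minimum of $g$.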