About minimizing dynamic of a convex function


Suppose that we have a strictly convex function $f: \mathbb R \rightarrow \mathbb R$ that admits a unique minimizer $x^*$ with $f(x^*)=0$. Let $x_0< x^*$; clearly $f$ is decreasing on $[x_0,x^*)$ and $f([x_0,x^*)) = (0,f(x_0)]$. We already know that there is a bijection $y:[0,\infty) \rightarrow [x_0,x^*)$ such that $y_0 = x_0$ and $y_t \uparrow x^*$ as $t\rightarrow \infty$ (meaning $t\mapsto y_t$ is strictly increasing). By construction, $t\mapsto f(y_t)$ is decreasing in $t$. My question is: can we always reparametrize $y_t$ to obtain a new bijection $z=(z_t)_{t\geq 0}$ such that

$$ z_t \leq e^{-\lambda t} z_0, \quad \text{or} \quad z_t \leq e^{-e^{\lambda t}} z_0, \quad \text{for some} \ \lambda>0 $$

while still preserving some properties of $t \mapsto y_t$, such as differentiability (we may assume $f$ is differentiable, or even smooth)?
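To make the notion of reparametrization concrete, here is one way it could be written down (a sketch, under the hypothetical assumption that the exponential decay is imposed on the function values $f(z_t)$ rather than on $z_t$ itself). Any time change
$$ z_t := y_{\sigma(t)}, \qquad \sigma:[0,\infty)\to[0,\infty) \ \text{an increasing bijection}, $$
preserves bijectivity (and smoothness, if $\sigma$ is smooth). Since $t \mapsto f(y_t)$ is a strictly decreasing bijection from $[0,\infty)$ onto $(0,f(x_0)]$, the choice
$$ \sigma(t) := (f\circ y)^{-1}\!\left(e^{-\lambda t} f(x_0)\right) $$
would force $f(z_t) = e^{-\lambda t} f(x_0)$ exactly, for any prescribed $\lambda>0$.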

This problem arose while I was learning convex optimization, and I tried to make a connection to the gradient flow associated with a function. People say that gradient flow is the "steepest descent", and a bound of the form $\leq e^{-\lambda t}$ is usually considered very good. However, I have never seen an even faster bound like $e^{-e^{\lambda t}}$, so I would like to know whether this faster rate actually occurs in specific examples. To be more precise: when the gradient flow associated with $f$ exists, can we reparametrize it to obtain better bounds?
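As a sanity check on the $e^{-\lambda t}$ rate (my own illustrative example, not from any reference): for $f(x)=x^2$ the gradient flow $x'(t) = -f'(x(t)) = -2x(t)$ solves to $x(t) = x_0 e^{-2t}$, so $f(x(t)) = f(x_0)\,e^{-4t}$. A short numerical check of this, using a plain explicit Euler discretization:

```python
import math

# Gradient flow x'(t) = -f'(x(t)) for the strictly convex f(x) = x^2,
# whose unique minimizer is x* = 0 with f(x*) = 0.  The exact solution
# is x(t) = x0 * exp(-2t), hence f(x(t)) = f(x0) * exp(-4t): the
# classical e^{-lambda t} decay with lambda = 4 in this example.

def f(x):
    return x * x

def grad_f(x):
    return 2.0 * x

def euler_gradient_flow(x0, t_end, n_steps):
    """Explicit Euler discretization of x' = -grad_f(x)."""
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        x -= dt * grad_f(x)
    return x

x0, t_end = -1.0, 3.0
x_num = euler_gradient_flow(x0, t_end, 100_000)
x_exact = x0 * math.exp(-2.0 * t_end)
print(f(x_num), f(x_exact))  # both close to exp(-12)
```

Here the flow approaches the minimizer but the *values* $f(x(t))$ decay exponentially, which is the kind of bound I mean above; the doubly exponential bound $e^{-e^{\lambda t}}$ is what I have never seen.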

Thank you for your help and comments.