Is obtaining convergence rate always more difficult and more important than providing proof of convergence for optimization methods?


In the field of deterministic and stochastic optimization, is obtaining a convergence rate always more difficult and more important than providing a proof of convergence for optimization methods?

For example, for methods like SGD.

BEST ANSWER

It mostly depends on what is converging. In most cases, proving the convergence of a sequence is easier and "less important" than proving the rate of convergence of that same sequence. But in many cases, the rate and the convergence itself are proved for different sequences.

Imagine that $S = \{ x_k \}_{k=1}^\infty \subseteq \mathbb{R}^n$ is a sequence generated by some algorithm. You can analyze the convergence of $S$, or alternatively, you can analyze $F = \{ f(x_k) \}_{k=1}^\infty$.

In many cases, proving the rate of convergence of $F$ to the optimal value is easier than merely proving that the sequence $S$ itself converges to an optimal vector. The usefulness of each proof depends on the application. There are cases in which you are interested only in the distance of $x_k$ to an optimal vector and do not care about the value of $f(x_k)$, and there are other cases in which you care about $f(x_k)$ being small, regardless of how close $x_k$ actually is to an optimal vector.
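To make the distinction concrete, here is a small illustrative sketch (my own example, not from the answer): gradient descent on a convex quadratic, tracking both the distance of the iterate $x_k$ to the minimizer and the gap $f(x_k) - f^*$. For an $L$-smooth function, $f(x_k) - f^* \le \frac{L}{2}\|x_k - x^*\|^2$, so the function-value gap can be far smaller than the distance to the optimal vector, which is one reason the two sequences are analyzed separately.

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T A x, whose minimizer is x* = 0
# and optimal value f* = 0. We track the two sequences from the answer:
#   S = {x_k}      via the distance  ||x_k - x*||
#   F = {f(x_k)}   via the gap       f(x_k) - f*
A = np.diag([100.0, 1.0])   # ill-conditioned problem: L = 100, mu = 1
L = np.max(np.diag(A))      # smoothness constant (largest eigenvalue)

def f(x):
    return 0.5 * x @ A @ x

x = np.array([1.0, 1.0])
for k in range(200):
    x = x - (1.0 / L) * (A @ x)   # gradient step with step size 1/L

dist = np.linalg.norm(x)          # ||x_k - x*||
gap = f(x)                        # f(x_k) - f*

# Smoothness gives f(x_k) - f* <= (L/2) * ||x_k - x*||^2, so the
# value gap shrinks roughly like the *square* of the iterate distance.
print(dist, gap)
```

After 200 steps the fast eigendirection is eliminated immediately, while the slow one decays like $(1 - 1/L)^k$; the printed gap is roughly the square of the printed distance, consistent with the smoothness bound.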