Convergence analysis of gradient descent method


From the following:

Convex Optimization (S. Boyd) p.467

Content:

We will see that the gradient method does in fact require a large number of iterations when the Hessian of $f$, near $x^⋆$, has a large condition number. Conversely, when the sublevel sets of $f$ are relatively isotropic, so that the condition number bound $M/m$ can be chosen to be relatively small, the bound (9.18) shows that convergence is rapid, since c is small, or at least not too close to one.
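To see this effect concretely, here is a small sketch (not from Boyd's text) of gradient descent on the quadratic $f(x) = \tfrac{1}{2}(x_1^2 + \kappa x_2^2)$, whose Hessian $\mathrm{diag}(1, \kappa)$ has condition number $M/m = \kappa$. When $\kappa$ is close to 1 the sublevel sets are nearly circular (isotropic) and convergence is fast; when $\kappa$ is large they are elongated ellipses and many iterations are needed:

```python
import numpy as np

def gradient_descent_iters(kappa, tol=1e-6, max_iter=200000):
    """Count gradient descent iterations on f(x) = 0.5*(x1^2 + kappa*x2^2).

    The Hessian is diag(1, kappa), so m = 1, M = kappa, and the
    condition number M/m equals kappa.
    """
    grad = lambda x: np.array([x[0], kappa * x[1]])
    x = np.array([1.0, 1.0])
    step = 1.0 / kappa  # fixed step size 1/M guarantees descent
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return k
        x = x - step * g
    return max_iter

# Nearly isotropic sublevel sets (kappa near 1): few iterations.
print(gradient_descent_iters(2))
# Highly anisotropic sublevel sets (large kappa): many iterations.
print(gradient_descent_iters(1000))
```

The iteration count grows roughly linearly with $\kappa$, which matches the bound (9.18): the contraction factor $c$ approaches 1 as $M/m$ grows.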

My question is:
What does "sublevel sets of $f$ are relatively isotropic" mean?

Does it mean, for example, that the Hessian is a diagonal matrix whose diagonal entries are all roughly the same size?

(I know the definition of sublevel sets and the meaning of isotropic, but I have no idea what their combination means here.)