Iterative optimization of $f(x_1,x_2)$ with $f$ strictly convex in $x_1$, $x_2$ but not $(x_1, x_2)$


I wish to (numerically) compute a minimizer $(x_1, x_2)^\star$ of some function $f(x_1, x_2)$ satisfying

  1. For fixed $x_2$, $f(x_1, x_2)$ is a strictly convex function of $x_1$.
  2. For fixed $x_1$, $f(x_1, x_2)$ is a strictly convex function of $x_2$.
  3. $f(x_1, x_2)$ is not a convex function of the joint input vector $(x_1, x_2)$.

under convex (actually linear) constraints on $(x_1, x_2)$.

My question is: Are there any results on whether the update rule (block-coordinate descent)

\begin{align} x_1^{(k+1)} &= \text{argmin}_{x_1} f(x_1, x_2^{(k)}) \\ x_2^{(k+1)} &= \text{argmin}_{x_2} f(x_1^{(k+1)}, x_2) \end{align}

where the superscript $(k)$ denotes the $k$-th iteration step, converges to a global minimum (assuming, of course, that one exists)?
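For concreteness, here is a minimal sketch of the update rule on an example function of my own choosing (not necessarily representative of the general case): $f(x_1, x_2) = x_1^2 + x_2^2 - 3x_1x_2$ on the box $[0,1]^2$. It is strictly convex in each variable separately, but its Hessian is indefinite, so it is not jointly convex. The coordinate minimizers are available in closed form here, so each `argmin` step is exact:

```python
# Example (my choice, not a general f): block-coordinate descent on
# f(x1, x2) = x1^2 + x2^2 - 3*x1*x2, restricted to the box [0, 1]^2.
# f is strictly convex in each variable separately (d^2 f / dxi^2 = 2 > 0),
# but its Hessian [[2, -3], [-3, 2]] is indefinite, so f is not jointly convex.

def f(x1, x2):
    return x1**2 + x2**2 - 3.0 * x1 * x2

def argmin_coordinate(c):
    # Minimizing x**2 - 3*c*x over x gives x = 1.5*c unconstrained;
    # project onto the feasible interval [0, 1].
    return min(max(1.5 * c, 0.0), 1.0)

def bcd(x1, x2, iters=50):
    # Alternate exact minimizations, as in the update rule above.
    for _ in range(iters):
        x1 = argmin_coordinate(x2)  # x1 <- argmin_{x1} f(x1, x2)
        x2 = argmin_coordinate(x1)  # x2 <- argmin_{x2} f(x1, x2)
    return x1, x2

print(bcd(0.9, 0.1))  # reaches the global minimizer (1.0, 1.0), f = -1
print(bcd(0.0, 0.0))  # stuck at (0.0, 0.0): coordinate-wise minimal, but f = 0 > -1
```

Note that from the starting point $(0, 0)$ the iteration stalls at a point that is minimal along each coordinate direction but not a global minimizer, which is exactly the kind of behavior I am worried about in general.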