Dual subgradient method - no analytical relation between primal and dual variables.


Consider a convex optimization problem: minimize $f(x)$ subject to the constraint $g(x)\leq0$. Following the lecture notes Link, Pg 23, the Lagrangian can be expressed as $$L(x,\lambda) = f(x) + \lambda g(x)$$ Suppose $x(\lambda)$ is the minimizer of $\min_x L(x,\lambda)$ (i.e., $x(\lambda)$ expresses $x$ in terms of $\lambda$; this expression is typically obtained from the KKT conditions). Then the subgradient method to update the dual variable $\lambda$ and find the optimal dual variable $\lambda^*$ is given by

  • Initialize $\lambda(0)$
  • for $i = 0, 1, 2, \dots$
  • $\quad x(i) = x(\lambda(i))$
  • $\quad \lambda(i+1) = [\lambda(i) + \alpha_i g(x(i))]_+$
  • end for

where $\alpha_i$ is the step size and $[\cdot]_+=\max(\cdot,0)$ is projection onto the nonnegative orthant.
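To make the iteration concrete, here is a minimal sketch of the loop above on a toy problem where $x(\lambda)$ does have a closed form. The problem data ($f(x)=x^2$, $g(x)=1-x$, the constant step size, and the iteration count) are my own choices for illustration, not from the notes:

```python
# Toy problem with a closed-form inner minimizer:
#   minimize f(x) = x^2  subject to  g(x) = 1 - x <= 0   (i.e. x >= 1)
# Lagrangian: L(x, lam) = x^2 + lam*(1 - x), minimized at x(lam) = lam/2.
# Known solution: x* = 1, lam* = 2.

lam = 0.0      # lambda(0)
alpha = 0.1    # constant step size, chosen for simplicity
for i in range(500):
    x = lam / 2.0                    # x(i) = argmin_x L(x, lambda(i))
    g = 1.0 - x                      # g(x(i)) is a subgradient of the dual
    lam = max(lam + alpha * g, 0.0)  # projected step [.]_+

print(x, lam)  # approaches x* = 1, lam* = 2
```

With a constant step the iterate contracts geometrically toward $\lambda^*=2$; with a diminishing step $\alpha_i \to 0$, $\sum_i \alpha_i = \infty$, convergence is guaranteed more generally.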

My question: in some cases it is hard or impossible to find $x(\lambda)$, i.e., there is no analytical expression for $x(\lambda)$ (this happens frequently when there are multiple nonlinear constraints). In these cases, how does the subgradient method work to find the optimal dual $\lambda^*$? If the subgradient method fails, is there any other efficient method to find $\lambda^*$? Any suggestion or reference is appreciated.
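For what it's worth, nothing in the update requires a closed form for $x(\lambda)$: each iteration only needs the *value* $x(i)$, which can come from any numerical solver applied to $\min_x L(x,\lambda(i))$. A sketch of this, on a made-up one-dimensional problem where the stationarity equation is transcendental (the problem data and the bisection solver are my own assumptions for illustration):

```python
import math

# Example with no closed-form x(lambda):
#   minimize f(x) = exp(x) + x^2  subject to  g(x) = 1 - x <= 0.
# Stationarity of L(x, lam) = exp(x) + x^2 + lam*(1 - x) gives
#   exp(x) + 2x = lam,
# which is transcendental, so the inner minimization is done numerically.
# Known solution: x* = 1, lam* = e + 2.

def x_of_lam(lam, lo=-20.0, hi=20.0, tol=1e-12):
    """Solve exp(x) + 2x = lam by bisection (the LHS is strictly increasing)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if math.exp(mid) + 2.0 * mid < lam:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

lam = 0.0
alpha = 1.0
for i in range(300):
    x = x_of_lam(lam)                          # numerical inner minimization
    lam = max(lam + alpha * (1.0 - x), 0.0)    # subgradient step with g(x) = 1 - x

print(x, lam)  # approaches x* = 1, lam* = e + 2
```

In higher dimensions the bisection would be replaced by a general unconstrained solver (e.g. Newton or quasi-Newton on $L(\cdot,\lambda)$); the dual update itself is unchanged.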