Dual Ascent Method (DAM)


I recently came across the dual ascent method. The method is part of my question, so it is written formally below. $$\min_{x} f(x)\quad \text{subject to } Ax = b$$ The Lagrangian is $$L(x, y) = f(x) + y^\intercal(Ax - b),$$ the dual function is $$g(y) = \inf_{x} L(x, y),$$ and the dual problem is $$\max_{y} g(y),$$ where $y$ is the dual variable. One method that solves this is the Dual Ascent Method (DAM), characterized at iteration $k$ by repeating the following two steps until convergence: $$1.\ x^{k+1} := \arg\min_x L(x, y^{k}) \quad (\text{minimization of } L \text{ over } x)$$ $$2.\ y^{k+1} := y^{k} + \alpha^{k}(Ax^{k+1} - b) \quad (\text{update of } y \text{ for the next iteration})$$ It may be an easy question, but my doubt is how the minimization in step 1 is actually performed. Is it done using partial derivatives?
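To make the question concrete, here is a small sketch of the two steps for an assumed quadratic objective $f(x) = \tfrac{1}{2}x^\intercal P x + q^\intercal x$ with $P$ positive definite (my own example, not from any particular source). In that special case step 1 does reduce to setting the partial derivatives of $L$ with respect to $x$ to zero, which gives a closed form; all names and the step size below are illustrative choices:

```python
import numpy as np

# Assumed example problem: f(x) = 0.5 x^T P x + q^T x, subject to A x = b.
# For this f, step 1 (argmin_x L(x, y)) has a closed form from the
# first-order condition  grad_x L = P x + q + A^T y = 0,
# i.e.  x = -P^{-1} (q + A^T y).
np.random.seed(0)
n, m = 4, 2
P = 2.0 * np.eye(n)           # positive definite, so the infimum is attained
q = np.random.randn(n)
A = np.random.randn(m, n)
b = np.random.randn(m)

y = np.zeros(m)               # dual variable y^0
alpha = 0.1                   # step size alpha^k (held constant here)
for k in range(2000):
    # Step 1: minimize the Lagrangian over x (closed form for quadratic f)
    x = np.linalg.solve(P, -(q + A.T @ y))
    # Step 2: gradient ascent on the dual, since grad g(y) = A x(y) - b
    y = y + alpha * (A @ x - b)

print(np.linalg.norm(A @ x - b))  # primal residual ||Ax - b||
```

For a general differentiable $f$, step 1 would instead be carried out by a numerical solver (e.g. gradient descent on $x \mapsto L(x, y^k)$), since no closed form is available.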