Gradient of a Lagrange dual function


Consider: $$\min_{x \in \mathbb{R}^n} f(x)$$ $$\ \ \ \ \ \ \ \text{s.t. }\ h(x) \leq 0$$

  1. Lagrangian: $L(x,\lambda) = f(x) + \lambda h(x)$, with multiplier $\lambda \geq 0$
  2. Suppose $x^* = \arg\min_{x} L(x,\lambda)$
  3. Suppose Lagrange dual function $g(\lambda) = \inf_x L(x,\lambda)$
  4. Suppose $f$ and $h$ are both convex functions.

Is it true that $\ \ \ \nabla_\lambda g(\lambda) = h(x^*)$?

Obviously the minimizer $x^*$ is a function of $\lambda$, so we may write $x^*(\lambda)$.

Next, consider a simple example:

$$\min_{x \in \mathbb{R}^n} x^Tx$$ $$\ \ \ \ \ \ \ \text{s.t. }\ Ax \leq b$$
Here $h(x) = Ax - b$ and $L(x,\lambda) = x^Tx + \lambda^T(Ax - b)$.
1. $\nabla_x L(x,\lambda) = 2x + A^T\lambda = 0 \implies x^* = \arg\min_x L(x,\lambda) = -\frac{1}{2}A^T\lambda$
2. $h(x^*) = Ax^* - b = -\frac{1}{2}AA^T\lambda - b$
3. $\nabla_\lambda(g(\lambda)) = -\frac{1}{2}AA^T\lambda - b$
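Filling in the intermediate step, substituting $x^* = -\frac{1}{2}A^T\lambda$ back into the Lagrangian makes the dual function explicit:

$$g(\lambda) = L(x^*,\lambda) = \frac{1}{4}\lambda^TAA^T\lambda - \frac{1}{2}\lambda^TAA^T\lambda - \lambda^Tb = -\frac{1}{4}\lambda^TAA^T\lambda - \lambda^Tb$$

Differentiating in $\lambda$ recovers step 3: $\nabla_\lambda g(\lambda) = -\frac{1}{2}AA^T\lambda - b$.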

Clearly, the equality holds in this example. However, does this equality always hold?
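The example above can also be checked numerically. The sketch below (NumPy) compares a finite-difference gradient of $g$ against $h(x^*)$; the particular $A$, $b$, and $\lambda$ are arbitrary illustrative choices, not from the original post:

```python
import numpy as np

# Arbitrary small instance for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))
b = rng.standard_normal(3)
lam = rng.random(3)  # dual variables, lambda >= 0

def g(lmb):
    """Dual function g(lambda) = inf_x L(x, lambda), using the
    closed-form minimizer x* = -(1/2) A^T lambda."""
    x = -0.5 * A.T @ lmb
    return x @ x + lmb @ (A @ x - b)

# h evaluated at the minimizer x*
x_star = -0.5 * A.T @ lam
h_at_x_star = A @ x_star - b

# Central finite-difference gradient of g at lam
eps = 1e-6
grad_fd = np.array([(g(lam + eps * e) - g(lam - eps * e)) / (2 * eps)
                    for e in np.eye(3)])

print(np.allclose(grad_fd, h_at_x_star, atol=1e-5))  # True
```

Since $g$ is quadratic in $\lambda$ here, the central difference is exact up to rounding, so the agreement is tight.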

BEST ANSWER

Regarding the differentiability of the dual function $g(\lambda)$, here is a sufficient condition (see Lemma 6.3.2, Theorem 6.3.3, and Theorem 6.3.4 of Bazaraa et al.'s Nonlinear Programming: Theory and Algorithms, 2006).

Let the domain $X$ be a nonempty compact set in $\mathbb{R}^n$, and let $f$ and $h$ be continuous. For any $\lambda$, if the minimizer of $L(x, \lambda)$ over $X$ is unique (say, $x^*$), then the dual function $g$ is differentiable at $\lambda$ with $\nabla g(\lambda) = h(x^*)$. Otherwise, if the set of minimizers of $L(x, \lambda)$ is nonempty but not a singleton, then $g$ is only subdifferentiable, and any $\tilde{x} \in \arg \min_x L(x, \lambda)$ yields a subgradient of $g$ at $\lambda$: $h(\tilde{x}) \in \partial g(\lambda)$.
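The non-unique case can be illustrated with a small 1-D instance (my own choice, not from the book): $f(x) = |x|$, $h(x) = -x$, $X = [-1, 1]$. At $\lambda = 1$, every $x \in [0,1]$ minimizes $L(x,\lambda) = |x| - \lambda x$, and $g$ has a kink whose one-sided slopes bracket the candidate subgradients $h(\tilde{x})$:

```python
import numpy as np

# Grid over the compact domain X = [-1, 1].
X = np.linspace(-1.0, 1.0, 20001)

def g(lam):
    # Dual function via grid minimization over the compact domain
    return np.min(np.abs(X) - lam * X)

# All (approximate) minimizers of L(x, 1) = |x| - x, and the
# resulting candidate subgradients h(x~) = -x~
L_vals = np.abs(X) - 1.0 * X
minimizers = X[np.isclose(L_vals, L_vals.min(), atol=1e-12)]
subgrads = -minimizers

# One-sided slopes of g at lambda = 1 bracket every h(x~):
eps = 1e-4
right_slope = (g(1 + eps) - g(1)) / eps   # ~ -1
left_slope = (g(1) - g(1 - eps)) / eps    # ~  0

print(left_slope, right_slope)            # slopes at the kink
print(subgrads.min(), subgrads.max())     # range of h over the minimizers
```

Here the minimizer set is $[0,1]$, the subgradients $h(\tilde{x})$ fill out $[-1, 0]$, and that interval is exactly the gap between the two one-sided slopes of $g$ at the kink, consistent with the theorem's subdifferentiable case.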