Is the optimal solution of a convex problem continuous with respect to parameters?


Given the convex problem

$$\begin{array}{ll} \text{minimize} & \displaystyle\sum_{i=1}^{N} f(x_i)\\ \text{subject to} & Ax = B\\ & 0\leq x_{\min} \leq x_i \leq x_{\max}\end{array}$$

where $f$ is convex, $A \in \mathbb{R}^{M \times N}$ has full row rank, and $N > M$. Denote the solution of the problem above by $x^*$. Now let the equality constraint depend on parameters $a$ and $b$:

$$A(a)x = B(b)$$

If $A(a)$ and $B(b)$ are continuous in $a$ and $b$, is $x^*(a,b)$ also continuous in $a$ and $b$?
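A quick numerical sanity check (my own toy instance, taking $f(x_i) = x_i^2$ and box bounds wide enough to be inactive at the optimum, so the minimizer has a closed form) suggests the answer should be yes:

```python
import numpy as np

# With f(x_i) = x_i^2 (strongly convex) and the box constraint inactive,
# the minimizer is the minimum-norm solution of A x = B:
#   x*(a, b) = A^T (A A^T)^{-1} B.
def xstar(A, B):
    return A.T @ np.linalg.solve(A @ A.T, B)

A = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 0.0]])   # full row rank, M = 2 < N = 3
B = np.array([1.0, 0.5])

x0 = xstar(A, B)
# Perturb the entries of A slightly; the solution moves by O(perturbation).
dA = 1e-4 * np.array([[1.0, -2.0, 0.5],
                      [0.0, 1.0, -1.0]])
x1 = xstar(A + dA, B)
print(np.linalg.norm(x1 - x0))   # small, on the order of ||dA||
```

Here $x^* = A^T(AA^T)^{-1}B$ is an explicit continuous function of the entries of $A$ and $B$, so continuity is clear in this special case; the question is whether it survives a general convex $f$ and active box constraints.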

Best answer:

Yes, locally, at least if the problem is strongly convex. Strict convexity may also suffice, but I am not sure about this.

If $f(x)$, $A(a)$, and $B(b)$ are smooth and there is no box constraint, then the result follows from applying the implicit function theorem to the condition that the gradient of the Lagrangian for the problem, $\mathcal{L}$, equals zero (which is the first-order condition for optimality). You can even get a formula for the tangent vector. Specifically, for $(x^*, \lambda^*)$ to be a solution to the problem, we must have
$$ 0 = \begin{bmatrix} \frac{\partial \mathcal{L}}{\partial x} \\ \frac{\partial \mathcal{L}}{\partial \lambda} \end{bmatrix}, $$
where both blocks are evaluated at the point $(x^*, \lambda^*)$. Differentiating this equation with respect to $a$ yields
$$ 0 = \frac{d}{da} \begin{bmatrix} \frac{\partial \mathcal{L}}{\partial x} \\ \frac{\partial \mathcal{L}}{\partial \lambda} \end{bmatrix} = \underbrace{\begin{bmatrix} \frac{\partial^2 \mathcal{L}}{\partial x^2} & \frac{\partial^2 \mathcal{L}}{\partial x \, \partial \lambda} \\ \frac{\partial^2 \mathcal{L}}{\partial \lambda \, \partial x} & \frac{\partial^2 \mathcal{L}}{\partial \lambda^2} \end{bmatrix}}_{\nabla^2 \mathcal{L}} \begin{bmatrix} \frac{dx^*}{da} \\ \frac{d\lambda^*}{da} \end{bmatrix} + \begin{bmatrix} \frac{\partial^2 \mathcal{L}}{\partial x \, \partial a} \\ \frac{\partial^2 \mathcal{L}}{\partial \lambda \, \partial a} \end{bmatrix}, $$
and you can solve this linear system to determine the tangent vectors $\frac{dx^*}{da}$ and $\frac{d\lambda^*}{da}$. The coefficient matrix $\nabla^2 \mathcal{L}$ is the KKT matrix; it is invertible when $f$ is strongly convex and $A$ has full row rank (note that $\frac{\partial^2 \mathcal{L}}{\partial \lambda^2} = 0$, since $\mathcal{L}$ is affine in $\lambda$). You can do the same thing to get tangent vectors for changing $b$.
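As a sketch of this computation (the instance below, $f(x) = \tfrac12\|x\|^2$ with $A(a) = [1,\, a,\, 0]$ and $b = 1$, is a hypothetical example of my own and ignores the box constraint), one can assemble the KKT system numerically and check the resulting tangent against finite differences:

```python
import numpy as np

# Instance: minimize 0.5*||x||^2 subject to A(a) x = b, with A(a) = [1, a, 0],
# b = 1. The Lagrangian is L = 0.5 x'x + lam*(A(a)x - b), so
#   dL/dx = x + A(a)^T lam,   dL/dlam = A(a) x - b,
# and differentiating the optimality conditions in a gives
#   [ I    A^T ] [ dx/da   ]      [ (dA/da)^T lam* ]
#   [ A     0  ] [ dlam/da ] = -  [ (dA/da) x*     ].

def primal_dual(a):
    """Closed-form minimizer and multiplier of 0.5*||x||^2 s.t. A(a)x = b."""
    A = np.array([[1.0, a, 0.0]])
    b = np.array([1.0])
    lam = -np.linalg.solve(A @ A.T, b)   # stationarity: x* = -A^T lam*
    return -A.T @ lam, lam

a = 0.5
A = np.array([[1.0, a, 0.0]])
dA = np.array([[0.0, 1.0, 0.0]])         # dA/da
x, lam = primal_dual(a)

K = np.block([[np.eye(3), A.T],
              [A, np.zeros((1, 1))]])    # the KKT matrix (nabla^2 L)
rhs = -np.concatenate([dA.T @ lam, dA @ x])
dx_da = np.linalg.solve(K, rhs)[:3]      # tangent vector dx*/da

# Finite-difference check against the closed-form solution map x*(a).
h = 1e-6
fd = (primal_dual(a + h)[0] - primal_dual(a - h)[0]) / (2 * h)
print(dx_da, fd)                         # the two agree closely
```

Note the zero lower-right block of `K`, matching $\frac{\partial^2 \mathcal{L}}{\partial \lambda^2} = 0$; invertibility here comes from the identity block (strong convexity of $f$) together with $A$ having full row rank.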

To extend the result to the smooth box-constrained problem, just use a sequence of log barrier penalty functions to approximate the box constraint.
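For instance (a one-dimensional toy of my own, without the equality constraint), replacing the box $0 \le x \le 1$ with the barrier $-\mu \log x - \mu \log(1 - x)$ gives smooth, strongly convex problems whose minimizers converge to the constrained minimizer as $\mu \to 0$:

```python
# Toy problem: minimize (x - 2)^2 subject to 0 <= x <= 1; the constrained
# minimizer is x* = 1. The barrier problem
#   minimize (x - 2)^2 - mu*log(x) - mu*log(1 - x)
# is smooth and strongly convex on (0, 1), and its minimizer x(mu) -> x*.
def barrier_minimizer(mu, iters=200):
    """Bisection on the strictly increasing derivative of the barrier objective."""
    def dphi(x):
        return 2.0 * (x - 2.0) - mu / x + mu / (1.0 - x)
    lo, hi = 1e-15, 1.0 - 1e-15    # dphi(lo) < 0 < dphi(hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if dphi(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for mu in (1e-1, 1e-3, 1e-6):
    print(mu, barrier_minimizer(mu))   # approaches the constrained optimum 1
```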

To extend the result to the non-smooth case, just approximate the convex objective function by a sequence of smooth functions that converge to it uniformly. You can form the necessary sequence of smooth approximations using the methods in the following paper:

Azagra, D. "Global approximation of convex functions." arXiv:1112.1042 (2011). https://arxiv.org/pdf/1112.1042.pdf
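Azagra's construction is quite general; for a concrete non-smooth instance such as $f(x) = |x|$, a standard explicit smoothing (not the paper's method) already illustrates the uniform approximation:

```python
import numpy as np

# For f(x) = |x|, the smooth convex functions f_eps(x) = sqrt(x^2 + eps^2)
# satisfy
#   0 <= f_eps(x) - |x| = eps^2 / (sqrt(x^2 + eps^2) + |x|) <= eps,
# i.e. f_eps -> f uniformly on all of R as eps -> 0 (error attained at x = 0).
x = np.linspace(-10.0, 10.0, 100001)
errors = []
for eps in (1.0, 0.1, 0.01):
    f_eps = np.sqrt(x**2 + eps**2)
    err = np.max(f_eps - np.abs(x))   # sup-norm error on the grid
    errors.append(err)
    print(eps, err)
```

With uniform convergence of the objectives in hand, the minimizers of the smoothed problems track the minimizer of the original problem, which is the limiting step the argument needs.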