Can I determine a parameter in the objective function of a convex problem, given the optimal solution?


Consider a convex optimization problem with inequality constraints: \begin{equation} \begin{aligned} \min_{x\in\Re^{n}} & ~x^\top H x + f^\top x + \lambda\sqrt{x^\top R x}\\ \text{s.t.} & ~Ax\leq b, \end{aligned} \end{equation} where both matrices $H$ and $R$ are positive definite. Suppose the optimal solution $x^*$ is given, but the non-negative scalar $\lambda$ is unknown. Is it possible to estimate the value of $\lambda$?

Many Thanks!


BEST ANSWER

Some thoughts:

Let $\theta$ denote the Lagrange multiplier of the constraint $Ax \leq b$. Assuming $x^\ast \neq 0$ (so that the square-root term is differentiable), $(\lambda, x^\ast, \theta)$ satisfies the KKT system \begin{align*} 2Hx^\ast + f + \lambda\frac{Rx^\ast}{\sqrt{(x^\ast)^\mathsf{T} R x^\ast}} + A^\mathsf{T}\theta &= 0, \\ \theta^\mathsf{T}(Ax^\ast - b) &= 0, \\ \theta &\ge 0, \\ \lambda &\ge 0. \end{align*} Since $x^\ast$ is fixed, this system is linear in $(\lambda, \theta)$, so a feasible $(\lambda, \theta)$ can be found by convex (indeed, linear) programming.

I tested some examples (using CVX in MATLAB), and it works.

ANSWER

Let $\gamma = \sqrt{(x^*)^T Rx^*}$. Notice that $x^*$ is a minimizer for the problem \begin{align} \tag{1} \text{minimize} & \quad x^T H x + f^T x \\ \text{subject to} &\quad Ax \leq b, \\ &\quad \sqrt{x^T R x} \leq \gamma. \end{align}

Solve problem (1) using an algorithm, such as an interior point method, that computes a Lagrange multiplier $\lambda$ for the constraint $\sqrt{x^T R x} \leq \gamma$. By a standard Lagrange multiplier theorem, which says, roughly speaking, that a hard constraint can be replaced by a penalty term in the objective weighted by its multiplier, $x^*$ is optimal for your original optimization problem with the value of $\lambda$ found by solving problem (1).

The CVX or CVXPY software packages make it easy to solve small instances of problem (1), obtaining Lagrange multipliers for the constraints. That might be an easy way to test out this approach.