Analytical Solution to a Linear Program with Linear and Quadratic Constraints


Consider the following minimization problem

$$ \min_x c' x $$

such that

\begin{align*} Ax &= 0 \\ x' x &= 1 \end{align*}

where $x \in \mathbb{R}^n$, $c \in \mathbb{R}^n$ and $A \in \mathbb{R}^{m \times n}$, with $m < n$ and $A$ of full (row) rank.

By using Lagrange multipliers, I was able to show that the solution $x^*$ is given by

$$ x^* = - \frac{1}{|Pc|} Pc $$

where $P = I_n - A' (AA')^{-1} A$ is the orthogonal projection matrix onto the kernel of $A$, and $|\cdot|$ denotes the Euclidean norm.

Just to show what I did: the gradient of the Lagrangian with respect to $x$ is

$$ \nabla \mathcal{L} = c - A' \lambda - 2 \mu x $$

where $\lambda \in \mathbb{R}^m$ is the vector of Lagrange multipliers for the constraint $Ax=0$ and $\mu \in \mathbb{R}$ is the multiplier for the constraint $x'x=1$. Setting $\nabla \mathcal{L} = 0$, pre-multiplying by $A$ and using $Ax=0$ gives $\lambda^* = (AA')^{-1} A c$, which can be plugged back in to yield $x^* = \frac{1}{2\mu}(c - A' \lambda^*) = \frac{1}{2\mu} P c$. The constraint $x'x=1$ then fixes $2\mu = \pm|Pc|$, and since the objective equals $|Pc|^2 / (2\mu)$, minimization selects the negative sign, giving the solution above.
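For what it's worth, the closed-form solution checks out numerically. The sketch below (random $A$ and $c$ are illustrative data, not from the question) builds the projector $P$, forms $x^* = -Pc/|Pc|$, verifies both constraints, and compares the objective against random feasible points:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))  # almost surely full row rank
c = rng.standard_normal(n)

# Orthogonal projection onto ker(A): P = I - A'(AA')^{-1}A
P = np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)

Pc = P @ c
x_star = -Pc / np.linalg.norm(Pc)

# Feasibility: Ax = 0 and x'x = 1
assert np.allclose(A @ x_star, 0)
assert np.isclose(x_star @ x_star, 1.0)

# Optimality spot-check: any unit vector in ker(A) should do no better,
# since c'z = (Pc)'z >= -|Pc| by Cauchy-Schwarz for z in ker(A), |z| = 1.
for _ in range(1000):
    z = P @ rng.standard_normal(n)
    z /= np.linalg.norm(z)
    assert c @ x_star <= c @ z + 1e-12
```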

What I would like to know is, for $b \in \mathbb{R}^m$,

  1. is it possible to modify the constraint $Ax=0$ into $Ax=b$?
  2. is it possible to modify the constraint $Ax=0$ into $Ax<0$?

while still being able to find a solution analytically?

In the case of a positive answer to both 1. and 2., I would then also be interested in their combination, i.e.: is it possible to modify the constraint $Ax=0$ into $Ax<b$?

By $<$ I mean elementwise inequality.

As far as 1. is concerned, it seems to me that my argument using the Lagrangian cannot easily be adapted to the case $Ax=b$. As for 2., I honestly do not know.