Suppose there is an $n$-dimensional column vector $\mathbf{x}$ and an objective function $f(\mathbf{x}) = (a \mathbf{x} + c)(b \mathbf{x} + d)$, where $a$ and $b$ are $n$-dimensional row vectors and $c$ and $d$ are scalars.
We want to find the maximum and minimum of $f(\mathbf{x})$ subject to the following constraints: each component of $\mathbf{x}$ satisfies $-1 \le x_i \le 1$, and $0 \le a \mathbf{x} + c \le 1$, $0 \le b \mathbf{x} + d \le 1$.
The quadratic part of $f$ is $\mathbf{x}^T a^T b \,\mathbf{x}$, whose Hessian $a^T b + b^T a$ is indefinite in general, so $f$ is neither convex nor concave and the problem cannot be handed directly to a convex solver. Is there a way to solve this problem efficiently, or programmatically?
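A quick numeric check of the indefiniteness claim, with made-up example vectors (not from the problem statement), looks like:

```python
import numpy as np

# Example row vectors a, b (illustrative placeholders only).
a = np.array([[1.0, 0.0, 2.0]])    # 1 x n
b = np.array([[0.0, 1.0, -1.0]])   # 1 x n

# Hessian of the quadratic part x^T a^T b x is the symmetric matrix
# H = a^T b + b^T a.
H = a.T @ b + b.T @ a
eigs = np.linalg.eigvalsh(H)

# Generically H has one positive and one negative eigenvalue
# (a.b +/- |a||b|), hence f is neither convex nor concave.
print(eigs)
```

Unless $a$ and $b$ are parallel, the two nonzero eigenvalues straddle zero, which is exactly why a generic convex solver cannot be applied to $f$ directly.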
Thank you
For the maximum, the problem is equivalent to the convex optimization problem below: the geometric mean $\sqrt{uv}$ is concave, the constraints are affine, and at the optimum $u = ax + c$ and $v = bx + d$, so the optimal value squared equals $\max f(\mathbf{x})$. \begin{align*} &\max_{x, u, v} \quad \sqrt{uv}\\ &\mathrm{s.t.}\quad\ ax + c \ge u, \\ &\qquad\quad bx + d \ge v, \\ &\qquad\quad u, v \ge 0, \\ &\qquad \quad 0 \le ax + c \le 1, \\ &\qquad\quad 0\le bx + d \le 1, \\ &\qquad\quad -1 \le x_i \le 1, \forall i. \end{align*}