Is there a closed form solution to the following convex optimization problem?


I would like to know if there is a closed-form solution to this convex optimization problem:

$$ \min_{\delta \in \mathbb{R}^n} \delta^Tv$$ such that

$$\lVert \delta \rVert_1 \leq k$$ $$\mathbf{1}^T\delta = 0$$ $$\delta + f \geq 0$$

where $v$, $\delta$, and $f$ are vectors in $\mathbb{R}^n$. Furthermore, $v$ and $f$ are known vectors with $v \leq 0$ (all entries of $v$ are nonpositive) and $f \geq 0$. The constant $k > 0$ is also known, and $\mathbf{1}$ is the vector of ones.
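Even without a closed form, the problem is a linear program, so it can be solved numerically. Below is a minimal sketch using SciPy's `linprog`, with the standard split $\delta = p - q$, $p, q \geq 0$, so that $\lVert \delta \rVert_1 \leq \mathbf{1}^T(p+q)$ turns the $\ell_1$ constraint into a linear one. The particular $v$, $f$, and $k$ at the bottom are an arbitrary illustration, not from the question:

```python
import numpy as np
from scipy.optimize import linprog

def solve_primal(v, f, k):
    """Solve  min d^T v  s.t.  ||d||_1 <= k,  1^T d = 0,  d + f >= 0,
    as an LP by splitting d = p - q with p, q >= 0."""
    n = len(v)
    # Stacked variable x = [p; q]; objective v^T p - v^T q = v^T d.
    c = np.concatenate([v, -v])
    ones = np.ones(n)
    I = np.eye(n)
    # Row 1:  1^T p + 1^T q <= k   (upper-bounds ||d||_1 by k)
    # Rows 2..n+1:  -(p - q) <= f  (i.e. d + f >= 0)
    A_ub = np.vstack([np.concatenate([ones, ones])[None, :],
                      np.hstack([-I, I])])
    b_ub = np.concatenate([[k], f])
    # 1^T p - 1^T q = 0  (i.e. 1^T d = 0)
    A_eq = np.concatenate([ones, -ones])[None, :]
    b_eq = [0.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
    d = res.x[:n] - res.x[n:]
    return d, res.fun

# Small illustrative instance (values chosen arbitrarily).
v = np.array([-1.0, -2.0, 0.0])
f = np.array([0.5, 0.5, 0.5])
d, val = solve_primal(v, f, k=1.0)
```

For this instance the optimum puts $+0.5$ on the most negative entry of $v$ and $-0.5$ on the entry with $v_i = 0$, giving objective value $-1$; the LP solver recovers this.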

Since Slater's condition holds, I tried forming the dual of the problem, but I wasn't able to make headway there either.

Here is the dual that I found (keeping the $\ell_1$ constraint in the inner minimization and dualizing the other two constraints, with $\lambda \geq 0$ for $\delta + f \geq 0$ and $\mu$ free for $\mathbf{1}^T\delta = 0$):

$$ \max_{\lambda \geq 0,\ \mu}\ -\lambda^T f + \min_{\delta:\ \lVert \delta \rVert_1 \leq k} \delta^T(v - \lambda + \mu\mathbf{1}) $$

Using the identity $\min_{\lVert \delta \rVert_1 \leq k} \delta^T c = -k\lVert c \rVert_\infty$ (Hölder's inequality, attained by concentrating all of the $\ell_1$ budget on a coordinate of largest $|c_i|$), this becomes

$$ \max_{\lambda \geq 0,\ \mu}\ -\lambda^T f - k \lVert v - \lambda + \mu\mathbf{1} \rVert_{\infty} $$
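As a sanity check on the step between the two dual expressions: for a fixed vector $c$, the inner minimum of $\delta^T c$ over $\lVert \delta \rVert_1 \leq k$ is $-k\lVert c \rVert_\infty$, attained at $\delta^\star = -k\,\operatorname{sign}(c_j)\,e_j$ with $j = \arg\max_i |c_i|$. A small numerical check (the vector $c$ and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 2.0
c = rng.standard_normal(n)

# Closed form: put the entire l1 budget on a coordinate of largest |c_i|,
# with sign chosen to make the inner product as negative as possible.
j = np.argmax(np.abs(c))
d_star = np.zeros(n)
d_star[j] = -k * np.sign(c[j])
closed_form = -k * np.max(np.abs(c))

# Hölder bound check: any d with ||d||_1 = k satisfies d^T c >= -k ||c||_inf.
D = rng.standard_normal((10000, n))
D = k * D / np.abs(D).sum(axis=1, keepdims=True)  # scale rows to ||d||_1 = k
sampled_min = (D @ c).min()
```

Here `d_star @ c` equals `closed_form`, and no sampled feasible point does better, consistent with the $-k\lVert \cdot \rVert_\infty$ term in the dual.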