Minimizing a convex quadratic function over the unit simplex: is there a closed-form solution?


Given matrix $A \in \mathbb{R}^{3\times 4}$ and vector $b \in \mathbb{R}^3$, I have the following optimization problem in $w \in \mathbb{R}^4$

\begin{equation*} \min_{w \in \Delta}\frac{1}{2}\lVert Aw - b\rVert_2^2 \end{equation*}

where

$$\Delta := \left\{ w \in \mathbb{R}^4 \mid w \geq 0, \sum_{i=1}^4 w_i = 1 \right\}$$

Could you please help me solve this problem? Is there a closed-form solution?
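For reference, even without a closed form the problem is a tiny quadratic program and can be solved numerically. A minimal sketch using `scipy.optimize.minimize` with the SLSQP method (the random $A$ and $b$ here are placeholders standing in for real data):

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder data; substitute your own A (3x4) and b (3,).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
b = rng.standard_normal(3)

def f(w):
    """Objective 0.5 * ||A w - b||_2^2."""
    r = A @ w - b
    return 0.5 * r @ r

def grad(w):
    """Gradient A^T (A w - b)."""
    return A.T @ (A @ w - b)

w0 = np.full(4, 0.25)  # feasible starting point on the simplex
res = minimize(
    f, w0, jac=grad, method="SLSQP",
    bounds=[(0.0, 1.0)] * 4,                                  # w >= 0 (and <= 1)
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1}, # sum w_i = 1
)
w = res.x
```

The solver enforces both the nonnegativity bounds and the sum-to-one equality constraint directly, so no post-processing is needed.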


Motivation

This optimization problem arises as part of a computer vision pipeline and is mainly used to perform a coordinate transformation. I am trying to process as many frames per second as possible. Having a closed-form solution instead of a numerical one would reduce the computation time significantly, in my humble opinion. For more information, see this question.



BEST ANSWER

Based on what I have read here (an exponentiated-gradient / mirror-descent update), I end up with the following update rule: \begin{equation*} (w_{t+1})_i = (w_t)_i \cdot \frac{\exp(-\eta(\nabla_{w_t}f)_i)}{w_t^T\exp(-\eta\nabla_{w_t}f)}, \quad \forall i \end{equation*} In my case, $\nabla_{w_t}f = A^T(Aw_t - b)$. Here $\eta > 0$ is the learning rate, and $w_0$ can be chosen as $(0.25, 0.25, 0.25, 0.25)$.

The stopping condition would be $\lVert w_{t+1} - w_t \rVert_2 \leq \epsilon$ for some $\epsilon > 0$.
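The update rule above can be sketched as follows (a minimal implementation; the values of `eta`, `eps`, and `max_iter` are illustrative choices, not prescribed by the answer):

```python
import numpy as np

def exp_grad_simplex(A, b, eta=0.1, eps=1e-8, max_iter=10_000):
    """Exponentiated-gradient (mirror-descent) iteration on the unit simplex
    for min 0.5 * ||A w - b||_2^2. The multiplicative update keeps w > 0,
    and the renormalization keeps sum(w) = 1."""
    n = A.shape[1]
    w = np.full(n, 1.0 / n)           # w_0 = (1/n, ..., 1/n)
    for _ in range(max_iter):
        g = A.T @ (A @ w - b)         # gradient of the objective at w_t
        z = w * np.exp(-eta * g)      # elementwise multiplicative update
        w_next = z / z.sum()          # denominator equals w_t^T exp(-eta g)
        if np.linalg.norm(w_next - w) <= eps:   # stopping condition
            return w_next
        w = w_next
    return w
```

Note that `z.sum()` is exactly the normalizer $w_t^T\exp(-\eta\nabla_{w_t}f)$ from the update rule, since $z_i = (w_t)_i \exp(-\eta(\nabla_{w_t}f)_i)$.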

ANOTHER ANSWER

Just use a non-negative least squares solver. For example, scipy.optimize.nnls returns the solution vector directly once you define the matrix $A$ and vector $b$. To enforce $\Delta := \{ w \in \mathbb{R}^4 \mid w \geq 0, \sum_i w_i = 1 \}$, you can normalize the solution vector afterwards as $w / \sum_{i=1}^{4} w_i$ (a heuristic: the normalized NNLS solution is not, in general, the exact minimizer over $\Delta$), or add a regularization term to your objective \begin{equation*} \min_{w \in \Delta}\frac{1}{2}\lVert Aw - b\rVert_2^2 + \lambda\lVert w\rVert_2^2 \end{equation*}
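A minimal sketch of the NNLS-plus-normalization approach (the random $A$ and $b$ are placeholders; as noted above, the normalization step is a heuristic, so the result is feasible for $\Delta$ but not necessarily optimal over it):

```python
import numpy as np
from scipy.optimize import nnls

# Placeholder data; substitute your own A (3x4) and b (3,).
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
b = rng.standard_normal(3)

w_nnls, residual = nnls(A, b)   # solves min ||A w - b||_2  s.t.  w >= 0
s = w_nnls.sum()
# Rescale onto the simplex; fall back to the uniform point if NNLS returns 0.
w = w_nnls / s if s > 0 else np.full(4, 0.25)
```

This is cheap (one small NNLS solve plus a rescale), which may matter for the frames-per-second goal, at the cost of exactness.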