I am trying to find the $k, \vec{w}$ that minimize the following objective:
$||\vec{y} - k (X \cdot \vec{w})||_2$ such that $\vec{1}^T \cdot \vec{w} = 1$, where $k$ is just a scalar value
I have tried Lagrange multipliers, but the resulting system of equations is not linear, and I did not manage to find a unique solution.
Questions:
- What is the best way to prove that the problem is convex (or potentially not convex)?
- Is there another way to find the theoretical solution?
- Is numerical optimization the only way to go here?
Any help would be greatly appreciated!
We can equivalently consider $\min_{\beta} \|y - X \beta\|_2$ with the constraint $1^\top \beta \ne 0$. (If we find a minimizer $\tilde{\beta}$, then we can take $k = 1^\top \tilde{\beta}$ and $w = \tilde{\beta} / (1^\top \tilde{\beta})$.) The constraint set is not a closed convex set, which leads to some issues.
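In the generic case where the unconstrained least-squares solution has nonzero coordinate sum, this reduction is easy to carry out numerically. A minimal sketch (the random data here is purely illustrative):

```python
import numpy as np

# Illustrative data; any X, y whose least-squares solution has a
# nonzero coordinate sum works here.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = rng.standard_normal(20)

# Unconstrained least squares: beta_hat = argmin ||y - X beta||_2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

s = beta_hat.sum()           # 1^T beta_hat
assert abs(s) > 1e-12        # generic case: the sum is nonzero

k = s                        # the scalar factor
w = beta_hat / s             # weights, rescaled to sum to one

# Check: w satisfies the constraint, and k * (X w) reproduces X beta_hat
print(np.isclose(w.sum(), 1.0))                 # True
print(np.allclose(k * (X @ w), X @ beta_hat))   # True
```

This recovers exactly the factorization $k = 1^\top \tilde{\beta}$, $w = \tilde{\beta} / (1^\top \tilde{\beta})$ described above.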
Consider the unconstrained problem $\min_\beta \|y - X \beta\|_2$ (no constraints on $\beta$).
Suppose there exists a minimizer $\hat{\beta}$ to the unconstrained problem such that $\vec{1}^\top \hat{\beta} \ne 0$. Then this will also be a minimizer of the constrained problem, and we are done.
However, a problem arises if every minimizer $\hat{\beta}$ of the unconstrained problem satisfies $\vec{1}^\top \hat{\beta} = 0$. Excluding the uninteresting case where $X$ is the zero matrix, minimizing the constrained problem then drives us toward such a solution $\hat{\beta}$ of the unconstrained problem: we can choose $\tilde{\beta}$ arbitrarily close to $\hat{\beta}$, making $\|y - X \tilde{\beta}\|_2$ approach $\|y - X \hat{\beta}\|_2$, but we can never attain that value while staying within the constraint $\vec{1}^\top \tilde{\beta} \ne 0$. The infimum exists, but no minimizer does.
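A concrete instance of this degenerate case (the specific $X$ and $y$ below are my own toy example): with one column and $X^\top y = 0$, the unique unconstrained minimizer is $\hat{\beta} = 0$, which violates the constraint.

```python
import numpy as np

# Degenerate case: X^T y = 0 forces the unique unconstrained
# minimizer beta_hat = 0, which has 1^T beta_hat = 0.
X = np.array([[1.0], [0.0]])
y = np.array([0.0, 1.0])

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)   # [0.] -- the only minimizer, with zero coordinate sum

# Along beta = eps (which satisfies the constraint), the objective is
# ||y - X beta||_2 = sqrt(eps^2 + 1) -> 1 as eps -> 0,
# so the infimum 1 is approached but never attained for eps != 0.
for eps in [1.0, 0.1, 0.01]:
    print(np.linalg.norm(y - X @ np.array([eps])))
```

Each printed value is strictly greater than $1$, yet they decrease toward it, matching the argument above.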