How can I find the minimizer of the following optimization problem?


I want to know whether the following optimization problem falls under "convex optimization". The problem is to minimize the objective $$ \left\{\min_{\boldsymbol{\beta}\in \mathbb{R}^{d}} \sum_{i=1}^{n} \left(y_{i}-\boldsymbol{\beta}^{T}\mathbf{A}\boldsymbol{\beta}\right)^{2}: y_{i}\in \mathbb{R}, \mathbf{A}\in \mathbb{R}^{d\times d} \right\} $$ where $\mathbf{A}$ is symmetric but not positive definite. I wish to find the minimizer $\boldsymbol{\beta}$ of the above optimization problem. At first glance, I thought that because $f(\boldsymbol{\beta}) = \boldsymbol{\beta}^{T}\mathbf{A}\boldsymbol{\beta}$ and $g(x) = x^{2}$ are both quadratic, and hence convex, the composition $g \circ f$ would also be convex. How can I find the minimizer $\boldsymbol{\beta}$?

Best answer:

This particular instance is straightforward to solve. Let $x = \boldsymbol{\beta}^{T}\mathbf{A}\boldsymbol{\beta}$ and first solve the scalar least-squares problem that arises in $x$, namely $\min_{x}\sum_{i=1}^{n}(y_{i}-x)^{2}$, whose solution is the sample mean $x^{\star} = \frac{1}{n}\sum_{i=1}^{n} y_{i}$. Let $v$ be any vector such that $x^{\star}$ and $v^{T}\mathbf{A}v$ have the same sign (e.g., if $x^{\star}>0$, let $v$ be an eigenvector associated with a positive eigenvalue). Now scale that vector suitably and pick $\boldsymbol{\beta} = \sqrt{\frac{|x^{\star}|}{|v^{T}\mathbf{A}v|}}\,v$, which gives $\boldsymbol{\beta}^{T}\mathbf{A}\boldsymbol{\beta} = x^{\star}$ and hence attains the minimum.

I assume $\mathbf{A}$ is indefinite and thus has both negative and positive eigenvalues. If not, you would have to add the constraint $x\geq 0$ or $x\leq 0$ to the scalar problem.
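The recipe above can be sketched numerically. This is a minimal illustration, not part of the original answer; the matrix $\mathbf{A}$ and the data $y_i$ below are made-up examples chosen so that $\mathbf{A}$ is symmetric and indefinite:

```python
import numpy as np

# Illustrative data (assumptions, not from the question): A is symmetric
# and indefinite, so it has both positive and negative eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, -3.0]])
y = np.array([4.0, 5.0, 3.0])

# Step 1: with x = beta^T A beta, the scalar problem
# min_x sum_i (y_i - x)^2 is solved by the sample mean.
x_star = y.mean()

# Step 2: pick a unit eigenvector v whose eigenvalue (= v^T A v) has the
# same sign as x_star. (If x_star == 0, beta = 0 works directly.)
eigvals, eigvecs = np.linalg.eigh(A)
idx = int(np.argmax(np.sign(eigvals) == np.sign(x_star)))
v = eigvecs[:, idx]

# Step 3: rescale v so that beta^T A beta = x_star.
beta = np.sqrt(abs(x_star) / abs(v @ A @ v)) * v

# beta attains the minimum objective value, sum_i (y_i - x_star)^2.
residual = np.sum((y - beta @ A @ beta) ** 2)
```

Note that the minimum objective value is not zero in general: it equals $\sum_{i}(y_{i}-x^{\star})^{2}$, the residual of the scalar least-squares fit.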