Optimizing $\frac{x^TAx}{c^Tx}$ subject to $\mathbf{1}^Tx = 1$


I need to optimize the following quadratic-over-linear objective:

$$ \frac{x^TAx}{c^T x} $$

subject to

$$\mathbf{1}^Tx = 1$$

where $A$ is a diagonal matrix with strictly positive diagonal entries, $c$ is a vector with strictly positive entries, and $x$ is the optimization variable. $\mathbf{1}$ is the vector of all ones.

There is a problem in the Boyd and Vandenberghe book that solves the unconstrained problem, but the solution I have found is not constructive and I am not able to generalize it to the constrained case.

I have tried to follow the standard approach, computing the gradient of the Lagrangian and setting it equal to zero, but I am not able to obtain the solution: I get stuck with a cubic matrix equation.

Does anybody know if this is a typical problem in any field?

By the way, I am looking for a closed-form solution because I need to use it somewhere else.
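As a sanity check before looking for a closed form, the problem can be solved numerically. The sketch below assumes a small made-up instance (the particular $A$ and $c$ are illustrative, not from the question) and runs projected gradient descent on the hyperplane $\mathbf{1}^Tx=1$:

```python
import numpy as np

# Hypothetical small instance: A diagonal with positive entries,
# c strictly positive (both made up for illustration).
n = 4
A = np.diag([1.0, 2.0, 3.0, 4.0])
c = np.array([1.0, 0.5, 2.0, 1.5])
one = np.ones(n)

def f(x):
    # quadratic-over-linear objective x^T A x / c^T x
    return (x @ A @ x) / (c @ x)

def grad(x):
    # quotient rule: (2 beta Ax - alpha c) / beta^2
    alpha, beta = x @ A @ x, c @ x
    return (2 * beta * (A @ x) - alpha * c) / beta**2

x = one / n                        # feasible starting point
for _ in range(20000):
    g = grad(x)
    g -= (one @ g / n) * one       # project onto {d : 1^T d = 0}
    x -= 0.01 * g

g = grad(x)
g -= (one @ g / n) * one
assert abs(one @ x - 1.0) < 1e-9   # still feasible
assert np.linalg.norm(g) < 1e-6    # stationary on the hyperplane
```

The stationary value found this way gives a target that any closed-form candidate can be compared against.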

BEST ANSWER

$\def\p#1#2{\frac{\partial #1}{\partial #2}} \def\o{{\tt1}} \def\a{\lambda} \def\b{\ell}$

Use an unconstrained vector $w$ to construct $x$ such that it automatically satisfies the constraint, treating the length and direction of $w$ as independent variables.
$$\eqalign{
\b &= \o:w\quad&\implies\quad&d\b&=\o:dw \\
x &= \b^{-1}w\quad&\implies\quad&dx&=\b^{-1}dw - w\b^{-2}d\b \\
&&&&= \b^{-1}(I-x\o^T)\,dw \\
}$$
Consider the numerator and denominator of the objective function.
$$\eqalign{
\alpha &= A:xx^T \quad&\implies d\alpha = 2Ax:dx \\
\beta &= c:x &\implies d\beta = c:dx \\
\a &= \beta^{-1}\alpha &\implies \beta\a = \alpha \\
}$$
In the above, a colon denotes the inner (Frobenius) product
$$A:B \;=\; {\rm Tr}(A^TB) \;=\; \sum_{i=1}^m\sum_{j=1}^n A_{ij}B_{ij}$$
which is also valid for vectors if you treat them as single-column matrices.
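The differential $dx=\b^{-1}(I-x\o^T)\,dw$ can be confirmed by a finite-difference check; a minimal sketch, assuming an arbitrary made-up $w$ and perturbation $dw$:

```python
import numpy as np

# Finite-difference check of dx = l^{-1} (I - x 1^T) dw
# for x = w / (1^T w); w and dw are arbitrary made-up values.
n = 4
one = np.ones(n)
w = np.array([0.3, 1.1, 0.7, 0.9])
dw = 1e-6 * np.array([0.5, -1.0, 0.25, 0.75])

def x_of(w):
    # normalized vector automatically satisfies 1^T x = 1
    return w / (one @ w)

l = one @ w
x = x_of(w)
dx_formula = (np.eye(n) - np.outer(x, one)) @ dw / l
dx_numeric = x_of(w + dw) - x_of(w)

# agreement to first order; the discrepancy is O(|dw|^2)
assert np.linalg.norm(dx_formula - dx_numeric) < 1e-10
```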

Calculate the gradient of the objective function with respect to the unconstrained variable by substituting the fragments above.
$$\eqalign{
d\a &= \beta^{-2}(\beta\,d\alpha-\alpha\,d\beta) \\
&= \beta^{-1}(2Ax-\a c):dx \\
&= \beta^{-1}(2Ax-\a c):\b^{-1}(I-x\o^T)\,dw \\
&= (\beta\b)^{-1}(I-\o x^T)\,(2Ax-\a c):dw \\
\p{\a}{w} &= (\beta\b)^{-1}(I-\o x^T)\,(2Ax-\a c) \\
}$$
Set the gradient to zero and solve. Since $x^T(2Ax-\a c) = 2\alpha - \a\beta = \a\beta$, this gives
$$\eqalign{
2Ax - \a c &= \o x^T(2Ax-\a c) = \a\beta\,\o \\
2Ax - \a\beta\,\o &= \a c \\
\left(\tfrac 2\a A - \o c^T\right)x &= c
\quad\implies\quad x = \left(\tfrac 2\a A-\o c^T\right)^{-1}c \\
}$$
The Sherman–Morrison formula for the inverse of a rank-one perturbation is
$$\Big(B+uv^T\Big)^{-1} = B^{-1} - \frac{B^{-1}uv^TB^{-1}}{1+v^TB^{-1}u}$$
Substitute $B=\tfrac 2\a A,\;u=\o,\;v=-c\;$ to obtain
$$\Big(\tfrac 2\a A - \o c^T\Big)^{-1} = \tfrac\a 2 \left(A^{-1} + \frac{A^{-1}\o c^TA^{-1}}{\tfrac 2\a-c^TA^{-1}\o}\right)$$
To reduce clutter in the next calculation, define the scalars
$$\theta=c^TA^{-1}c,\qquad\rho=c^TA^{-1}\o,\qquad\mu=\o^TA^{-1}\o$$
Now use the constraint to evaluate $\a$ via a scalar quadratic equation.
$$\eqalign{
\o^Tx &= \tfrac\a 2 \left(\o^TA^{-1}c + \frac{(\o^TA^{-1}\o)\,(c^TA^{-1}c)}{\tfrac 2\a-\rho}\right) \\
2 &= \a\rho + \frac{\a^2\theta\mu}{2-\a\rho} \\
0 &= (\theta\mu-\rho^2)\,\a^2 + 4\rho\,\a - 4 \\
\a_\pm &= \frac{2}{\rho\pm\sqrt{\theta\mu}} \\
}$$
Substituting back yields two candidate vectors
$$\eqalign{
x_\pm &= \frac{1}{\rho\pm\sqrt{\theta\mu}}\, A^{-1}\left(c\pm\sqrt{\frac{\theta}{\mu}}\,\o\right) \\
&= \frac{A^{-1}c\pm\Big(\frac{c^TA^{-1}c}{\o^TA^{-1}\o}\Big)^{\frac 12}A^{-1}\o}{c^TA^{-1}\o\pm\Big((c^TA^{-1}c)\;(\o^TA^{-1}\o)\Big)^{\frac 12}} \\
}$$
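The two closed-form candidates can be checked numerically: each should satisfy the constraint $\o^Tx=1$ and the stationarity condition $2Ax - \a\beta\,\o = \a c$. A sketch, assuming a small made-up instance (the $A$ and $c$ below are illustrative):

```python
import numpy as np

# Hypothetical instance: A diagonal positive, c strictly positive.
n = 4
A = np.diag([1.0, 2.0, 3.0, 4.0])
c = np.array([1.0, 0.5, 2.0, 1.5])
one = np.ones(n)
Ainv = np.diag(1.0 / np.diag(A))

theta = c @ Ainv @ c      # theta = c^T A^{-1} c
rho   = c @ Ainv @ one    # rho   = c^T A^{-1} 1
mu    = one @ Ainv @ one  # mu    = 1^T A^{-1} 1

def x_closed(sign):
    # x_pm = A^{-1} (c +- sqrt(theta/mu) 1) / (rho +- sqrt(theta mu))
    s = sign * np.sqrt(theta / mu)
    return Ainv @ (c + s * one) / (rho + sign * np.sqrt(theta * mu))

def f(x):
    return (x @ A @ x) / (c @ x)

for sign in (+1.0, -1.0):
    x = x_closed(sign)
    lam, beta = f(x), c @ x
    assert abs(one @ x - 1.0) < 1e-12          # constraint holds
    residual = 2 * A @ x - lam * beta * one - lam * c
    assert np.linalg.norm(residual) < 1e-9     # stationarity holds
```

Note that the objective value at each candidate equals the corresponding $\a_\pm = 2/(\rho\pm\sqrt{\theta\mu})$, which is a convenient way to decide which root to keep.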