I have a problem where I need to solve for vector $w$ using
$\min \frac{{w}'\Sigma_{1}w}{{w}'\Sigma_{2}w}$
$s.t.$
${w}'e = 1$
where $e$ is a vector of ones, and $\Sigma_{1}$ and $\Sigma_{2}$ are both symmetric positive-definite covariance matrices.
I am struggling to find an answer.
Here is what I have done so far:
Step 1:
Rewrite the problem as
$\min {w}'\Sigma_{1}w$
$s.t.$
${{w}'\Sigma_{2}w} = r^2$
${w}'e = 1$
where I can minimize the solution over a range of $r^2$.
Step 2:
Minimize the Lagrangian of the modified problem in Step 1
$\min {w}'\Sigma_{1}w - \lambda ({w}'\Sigma_{2}w - r^2) - 2\mu({w}'e - 1)$
which gives:
$(\Sigma_{1} - \lambda \Sigma_{2})w = \mu e$
or, $w = \mu (\Sigma_{1} - \lambda \Sigma_{2})^{-1} e$
Substituting into ${w}'e = 1$ gives $\mu = \frac{1}{{e}'(\Sigma_{1} - \lambda \Sigma_{2})^{-1} e}$ and $w = \frac{(\Sigma_{1} - \lambda \Sigma_{2})^{-1} e}{{e}'(\Sigma_{1} - \lambda \Sigma_{2})^{-1} e}$.
Substituting $w$ into ${{w}'\Sigma_{2}w} = r^2$ gives:
$\frac{{e}'(\Sigma_{1} - \lambda \Sigma_{2})^{-1}\Sigma_{2}(\Sigma_{1} - \lambda \Sigma_{2})^{-1} e}{({e}'(\Sigma_{1} - \lambda \Sigma_{2})^{-1} e)^2} = r^2$
and I am lost after this, because I'm not sure I can easily solve for $\lambda$ in terms of $r$.
For typing convenience, define the matrices $$A = \Sigma_1,\quad B = \Sigma_2,\quad M = B^{-1}A$$ and the scalar functions $$\eqalign{ \alpha &= w^TAw \implies \frac{\partial \alpha}{\partial w} = 2Aw \\ \beta &= w^TBw \implies \frac{\partial \beta}{\partial w} = 2Bw \\ }$$ Write the cost function in terms of these new variables.
Then calculate its differential and gradient. $$\eqalign{ \lambda &= \beta^{-1}\alpha \\ d\lambda &= \beta^{-2}\big(\beta\,d\alpha-\alpha\,d\beta\big) \\ \frac{\partial \lambda}{\partial w} &= 2\beta^{-1}\big(Aw-\lambda Bw\big) \\ }$$ Setting the gradient to zero leads to an eigenvalue problem. $$\eqalign{ Aw &= \lambda Bw\quad &({\rm generalized\,EV\,problem}) \\ Mw &= \lambda w\quad &({\rm standard\,EV\,problem}) \\ }$$ Thus the min/max of the cost function equals the minimum/maximum eigenvalue of the $M$-matrix, and the $w$-vectors are the corresponding eigenvectors.
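As a sketch of this recipe in Python (using NumPy/SciPy; the names `A` and `B` follow the notation above, and the random example matrices are my own, not from the original problem):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 4

# Build two hypothetical symmetric positive-definite matrices:
# X @ X.T is PSD, adding n*I makes it strictly positive definite.
X1 = rng.standard_normal((n, n))
X2 = rng.standard_normal((n, n))
A = X1 @ X1.T + n * np.eye(n)   # plays the role of Sigma_1
B = X2 @ X2.T + n * np.eye(n)   # plays the role of Sigma_2

# Solve the generalized symmetric eigenproblem  A w = lambda B w.
# scipy.linalg.eigh returns eigenvalues in ascending order, so the
# first column of V minimizes the ratio w'Aw / w'Bw.
vals, V = eigh(A, B)
w = V[:, 0]

# Rescale so that w'e = 1; scaling does not change the ratio.
w = w / w.sum()

ratio = (w @ A @ w) / (w @ B @ w)
```

After rescaling, `ratio` still equals the smallest generalized eigenvalue `vals[0]`, which is the point of the answer: the constraint only fixes the length of the eigenvector.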
If $B^{-1}$ does not exist, then you must solve the generalized eigenvalue problem instead.
Note that the given normalization condition has no effect on the eigenvalue equation or the cost function. For example, scale the vector by a factor of a hundred and substitute it into the cost function. $$\eqalign{ \frac{(100\,w)^TA(100\,w)}{(100\,w)^TB(100\,w)} = \frac{w^TAw}{w^TBw} = \lambda }$$ That being said, after finding an eigenvector, you can certainly scale it to any length that is required.
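The scale invariance is easy to check numerically (the small SPD matrices and the vector below are arbitrary illustrations, not from the problem):

```python
import numpy as np

# Arbitrary 2x2 symmetric positive-definite matrices and a test vector.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
w = np.array([0.7, 0.3])

def ratio(v):
    """Rayleigh-quotient-style cost w'Aw / w'Bw."""
    return (v @ A @ v) / (v @ B @ v)

# The factors of 100 cancel between numerator and denominator.
same = np.isclose(ratio(w), ratio(100 * w))
```
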