I am trying to find a closed-form solution to the following problem:
\begin{equation} \begin{array}{cl} \min\limits_{\mathbf{x},\,\mathbf{y}} & \lambda_1\left( \mathbf{y}^T V^{(1)}\mathbf{x} \right)^2 + \lambda_2\left( \mathbf{y}^T V^{(2)}\mathbf{x} \right)^2 \\ \text{s.t.} & \|\mathbf{x}\|_2 = 1, \\ & \|\mathbf{y}\|_2 = 1, \end{array} \end{equation}
where $\mathbf{x} \in \mathbb{R}^{m}$, $\mathbf{y} \in \mathbb{R}^{n}$, and $V^{(i)} \in \mathbb{R}^{n\times m}$. Also, $\lambda_1 \geq \lambda_2 \geq 0$.
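For concreteness, here is the objective and the unit-norm setup in numpy (a minimal sketch; the sizes, the random matrices, and the `objective` helper are hypothetical, not part of the problem statement):

```python
import numpy as np

def objective(x, y, V1, V2, lam1, lam2):
    """lam1*(y^T V1 x)^2 + lam2*(y^T V2 x)^2 for unit vectors x, y."""
    return lam1 * (y @ V1 @ x) ** 2 + lam2 * (y @ V2 @ x) ** 2

# hypothetical sizes and data
rng = np.random.default_rng(0)
n, m = 5, 4
V1, V2 = rng.standard_normal((n, m)), rng.standard_normal((n, m))
lam1, lam2 = 2.0, 1.0                               # lam1 >= lam2 >= 0
x = rng.standard_normal(m); x /= np.linalg.norm(x)  # ||x||_2 = 1
y = rng.standard_normal(n); y /= np.linalg.norm(y)  # ||y||_2 = 1
print(objective(x, y, V1, V2, lam1, lam2))
```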
I know that the left- and right-singular vectors corresponding to the smallest singular value $\sigma_m^{(i)}$ of $V^{(i)}$ minimize the individual terms in the above objective function.
I have two questions here: 1) How can I prove that this is a convex optimization problem? (I don't have a clue about this.) 2) Should I use the singular-vector pair that minimizes one of the terms in the above objective function as a starting point for an iterative algorithm like steepest descent, or is there another way of getting to a closed-form solution?
Any constructive suggestion/critique is more than welcome. Thank you.
I would suggest an alternating minimization approach.
For $k=0,1,2,\dots$ until convergence:
Fix $y = y_k$ and minimize over $x$ to find $x_+$: $$x_+ = \mathop{\textrm{argmin}}_{\|x\|=1} \left\| \begin{bmatrix} \sqrt{\lambda_1}y_k^T V^{(1)} x \\ \sqrt{\lambda_2}y_k^T V^{(2)} x\end{bmatrix} \right\|$$ $x_+$ is just the right singular vector associated with the minimum singular value of that $2\times m$ matrix.
Then fix $x = x_+$ and minimize over $y$ to find $y_+$: $$y_+ = \mathop{\textrm{argmin}}_{\|y\|=1} \left\| y^T \begin{bmatrix} \sqrt{\lambda_1} V^{(1)} x_+ & \sqrt{\lambda_2} V^{(2)} x_+ \end{bmatrix} \right\|$$ $y_+$ is the left singular vector associated with the minimum singular value of that $n\times 2$ matrix. Finally, update $$x_{k+1} = \frac{x_k+\alpha(x_+-x_k)}{\|x_k+\alpha(x_+-x_k)\|_2}, \qquad y_{k+1} = \frac{y_k+\beta(y_+-y_k)}{\|y_k+\beta(y_+-y_k)\|_2}.$$
$\alpha$ and $\beta$ are step sizes you can toy with. Don't be afraid to try $\alpha=\beta=1$ right off the bat, but I am guessing you are going to need to be conservative, or maybe even do line searches. You might also consider multiple steps of $x$, then multiple steps of $y$.
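The scheme above might be sketched in numpy roughly as follows. This is a heuristic sketch, not a guaranteed method: the `alt_min` name, the random initialization, the iteration count, and the sign alignment of the singular vectors (to keep the averaging step from oscillating between $\pm$ directions) are my additions.

```python
import numpy as np

def alt_min(V1, V2, lam1, lam2, alpha=1.0, beta=1.0, iters=50, seed=0):
    """Alternating minimization heuristic for
    min lam1*(y^T V1 x)^2 + lam2*(y^T V2 x)^2  s.t. ||x||_2 = ||y||_2 = 1.
    The problem is non-convex: no guarantee of a global minimum."""
    n, m = V1.shape
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(m); x /= np.linalg.norm(x)
    y = rng.standard_normal(n); y /= np.linalg.norm(y)
    s1, s2 = np.sqrt(lam1), np.sqrt(lam2)
    for _ in range(iters):
        # x-step: min right singular vector of the 2 x m stacked matrix
        A = np.vstack([s1 * (y @ V1), s2 * (y @ V2)])
        x_plus = np.linalg.svd(A)[2][-1]      # last row of V^T
        if x_plus @ x < 0:                    # sign alignment (my addition)
            x_plus = -x_plus
        x = x + alpha * (x_plus - x)
        x /= np.linalg.norm(x)
        # y-step: min left singular vector of the n x 2 stacked matrix
        B = np.column_stack([s1 * (V1 @ x), s2 * (V2 @ x)])
        y_plus = np.linalg.svd(B)[0][:, -1]   # last column of U
        if y_plus @ y < 0:
            y_plus = -y_plus
        y = y + beta * (y_plus - y)
        y /= np.linalg.norm(y)
    obj = lam1 * (y @ V1 @ x) ** 2 + lam2 * (y @ V2 @ x) ** 2
    return x, y, obj
```

Note that with `alpha = beta = 1` each update is an exact subproblem solve, so the objective is non-increasing from iteration to iteration; smaller step sizes trade that monotonicity for damping.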
There are no guarantees here. This might work, it might be lousy. But your problem is non-convex, and you cannot expect guarantees.