I want to find the solution of a Lagrange equation whose inputs are matrices.
First I have the equation $Ax=0$. By decomposing $A$ into $A_3$ (columns 9 to 11 of $A$) and $A_9$ (the remaining columns), and imposing the normalization constraint $m_{31}^2+m_{32}^2+m_{33}^2=1$, I obtain a Lagrange equation with a closed-form linear solution.
How can I solve this in MATLAB?
Thank you!
Edit: I found two solutions. The first uses the SVD and takes the last column of the matrix $V$, but it outputs $0$, which is exactly what I am trying to avoid. The second is to compute $m_3$ as an eigenvector of the matrix on the right-hand side of the first equation in (8). Is the second solution correct?
We have a homogeneous linear system in $x \in \mathbb R^n$
$$A x = 0_m$$
where $A \in \mathbb R^{m \times n}$. We also have constraints on the last $n_2$ entries of $x$. Hence, we write
$$\begin{bmatrix} A_1 & A_2\end{bmatrix} \begin{bmatrix} x_1\\ x_2\end{bmatrix} = 0_m$$
where $A_1 \in \mathbb R^{m \times n_1}$ and $A_2 \in \mathbb R^{m \times n_2}$.
We would like to minimize $\|Ax\|_2$ subject to the equality constraint $\|x_2\|_2 = 1$. Hence, we have the following quadratically constrained quadratic program (QCQP)
$$\begin{array}{ll} \text{minimize} & \|A_1 x_1 + A_2 x_2\|_2^2\\ \text{subject to} & \|x_2\|_2^2 = 1\end{array}$$
Let the Lagrangian be
$$\mathcal{L} (x_1, x_2, \lambda) := \begin{bmatrix} x_1\\ x_2\end{bmatrix}^T \begin{bmatrix} A_1^T A_1 & A_1^T A_2\\ A_2^T A_1 & A_2^T A_2\end{bmatrix} \begin{bmatrix} x_1\\ x_2\end{bmatrix} - \lambda (x_2^T x_2 - 1)$$
Taking the partial derivatives and finding where they vanish, we obtain three equations
$$A_1^T A_1 x_1 + A_1^T A_2 x_2 = 0_{n_1} \qquad \qquad A_2^T A_1 x_1 + A_2^T A_2 x_2 = \lambda x_2 \qquad \qquad x_2^T x_2 = 1$$
Assuming that $A_1$ has full column rank, $A_1^T A_1$ is invertible. From the 1st equation, we then have
$$x_1 = -(A_1^T A_1)^{-1} A_1^T A_2 x_2$$
and
$$A_1 x_1 = - \underbrace{A_1 (A_1^T A_1)^{-1} A_1^T}_{=:P_1} A_2 x_2 = -P_1 A_2 x_2$$
where $P_1$ is the projection matrix that projects onto the column space of $A_1$. Note that $I_m - P_1$ is the projection matrix that projects onto the left null space of $A_1$, which is orthogonal to the column space of $A_1$. As $I_m - P_1$ is a projection matrix, $(I_m - P_1)^2 = I_m - P_1$, which we will use. Hence,
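As a quick sanity check of these projection identities, here is a minimal sketch in Python/NumPy (the dimensions and variable names are illustrative, not from the original post):

```python
import numpy as np

# Illustrative check of the projector properties used in the derivation,
# on a random full-column-rank A1.
rng = np.random.default_rng(0)
m, n1 = 8, 3
A1 = rng.standard_normal((m, n1))

P1 = A1 @ np.linalg.inv(A1.T @ A1) @ A1.T   # projector onto col(A1)
Q1 = np.eye(m) - P1                          # projector onto the left null space

assert np.allclose(Q1 @ Q1, Q1)   # idempotent: (I - P1)^2 = I - P1
assert np.allclose(P1 @ A1, A1)   # P1 fixes the column space of A1
assert np.allclose(Q1 @ A1, 0)    # I - P1 annihilates col(A1)
```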
$$\begin{array}{rl} \|A_1 x_1 + A_2 x_2\|_2^2 &= \|(I_m - P_1) A_2 x_2\|_2^2\\\\ &\geq \sigma_{\min}^2 ((I_m - P_1) A_2) \|x_2\|_2^2\\\\ &= \sigma_{\min}^2 ((I_m - P_1) A_2)\\\\ &= \lambda_{\min} (A_2^T (I_m - P_1)^2 A_2)\\\\ &= \lambda_{\min} (A_2^T (I_m - P_1) A_2)\end{array}$$
where we used the equality constraint $\|x_2\|_2^2 = 1$. Thus, we conclude that
the minimum is attained at the intersection of the eigenspace of the minimum eigenvalue of $A_2^T (I_m - P_1) A_2$ with the unit Euclidean sphere in $\mathbb R^{n_2}$.
Hence, $x_2$ is a normalized eigenvector of $A_2^T (I_m - P_1) A_2$ corresponding to its minimum eigenvalue. Equivalently, it is a (normalized) right singular vector of $(I_m - P_1) A_2$ corresponding to its minimum singular value. Once we have $x_2$, we obtain $x_1$ via $x_1 = -(A_1^T A_1)^{-1} A_1^T A_2 x_2$.
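The whole recipe can be sketched numerically. The following Python/NumPy snippet (mirroring the MATLAB calls discussed later; the data and dimensions are made up for illustration) forms $M = A_2^T (I_m - P_1) A_2$, takes a unit eigenvector for the smallest eigenvalue as $x_2$, back-substitutes for $x_1$, and verifies the stationarity conditions:

```python
import numpy as np

# Hedged sketch: random A split into [A1 | A2]; dimensions are illustrative.
rng = np.random.default_rng(1)
m, n1, n2 = 12, 8, 3
A = rng.standard_normal((m, n1 + n2))
A1, A2 = A[:, :n1], A[:, n1:]

P1 = A1 @ np.linalg.inv(A1.T @ A1) @ A1.T      # projector onto col(A1)
M = A2.T @ (np.eye(m) - P1) @ A2               # symmetric PSD matrix

w, V = np.linalg.eigh(M)                       # eigenvalues in ascending order
x2 = V[:, 0]                                   # unit eigenvector, min eigenvalue
x1 = -np.linalg.inv(A1.T @ A1) @ A1.T @ A2 @ x2

# Stationarity conditions and the constraint from the Lagrangian:
assert np.isclose(np.linalg.norm(x2), 1.0)                  # ||x2|| = 1
assert np.allclose(A1.T @ (A1 @ x1 + A2 @ x2), 0)           # 1st equation
assert np.allclose(A2.T @ (A1 @ x1 + A2 @ x2), w[0] * x2)   # 2nd equation
```

Note that the attained objective value $\|A_1 x_1 + A_2 x_2\|_2^2$ equals the minimum eigenvalue `w[0]`, matching the bound derived above.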
We can arrive at the same conclusion using a different approach. Using $A_1 x_1 = - P_1 A_2 x_2$, the 2nd equation produced by the vanishing gradient of the Lagrangian then becomes
$$A_2^T (I_m - P_1) A_2 x_2 = \lambda x_2$$
Thus, $(\lambda, x_2)$ is an eigenpair of $A_2^T (I_m - P_1) A_2$. Which eigenpair? The eigenpair corresponding to the minimum eigenvalue of $A_2^T (I_m - P_1) A_2$, and in which the eigenvector has unit $2$-norm.
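The equivalence between the eigenvalue route and the SVD route can also be checked numerically. This Python/NumPy sketch (random illustrative data) confirms that the right singular vector of $(I_m - P_1) A_2$ for the smallest singular value agrees, up to sign, with the eigenvector of $A_2^T (I_m - P_1) A_2$ for the smallest eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n1, n2 = 10, 6, 3
A1 = rng.standard_normal((m, n1))
A2 = rng.standard_normal((m, n2))

P1 = A1 @ np.linalg.inv(A1.T @ A1) @ A1.T
B = (np.eye(m) - P1) @ A2

_, s, Vt = np.linalg.svd(B)
x2_svd = Vt[-1]                    # right singular vector for sigma_min

w, V = np.linalg.eigh(A2.T @ (np.eye(m) - P1) @ A2)
x2_eig = V[:, 0]                   # eigenvector for lambda_min

assert np.isclose(s[-1] ** 2, w[0])          # sigma_min^2 = lambda_min
assert np.isclose(abs(x2_svd @ x2_eig), 1.0) # same vector up to sign
```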
In MATLAB, use the function `inv` to invert matrices and `eig` to find eigenvalues and eigenvectors. If $A_1$ does not have full column rank, use `pinv` to compute the pseudoinverse of $A_1^T A_1$.

Lastly, there are several errors in Huang & Barth [PDF]. For example, in equation (7), the Lagrangian is typeset incorrectly: the $2$-norms are missing.
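For the rank-deficient case, the pseudoinverse fallback can be sketched as follows (Python/NumPy standing in for MATLAB's `pinv`; the rank-deficient $A_1$ is constructed artificially for illustration). The projector formula and the back-substitution survive unchanged with `inv` replaced by `pinv`:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n1, n2 = 10, 4, 3
# Deliberately rank-deficient A1 (rank 3 < n1 = 4):
A1 = rng.standard_normal((m, n1 - 1)) @ rng.standard_normal((n1 - 1, n1))
A2 = rng.standard_normal((m, n2))

G = np.linalg.pinv(A1.T @ A1)      # pseudoinverse replaces the inverse
P1 = A1 @ G @ A1.T                 # still the orthogonal projector onto col(A1)
assert np.allclose(P1 @ P1, P1)    # idempotent, as required

M = A2.T @ (np.eye(m) - P1) @ A2
w, V = np.linalg.eigh(M)
x2 = V[:, 0]
x1 = -G @ A1.T @ A2 @ x2           # minimum-norm choice of x1

# Objective value still equals the minimum eigenvalue:
assert np.isclose(np.linalg.norm(A1 @ x1 + A2 @ x2) ** 2, w[0])
```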