I am interested in minimizing the following Lagrangian function:
\begin{align*} \frac{1}{2}\log\det (PWP^\top) - \operatorname{trace}(\Lambda^\top (PS - I)), \end{align*} where $P$ is an $m \times n$ matrix (unknown; rank is $m$), $W$ is positive definite (known), $S$ is an $n \times m$ matrix (known; rank is $m$) and $\Lambda$ is the Lagrange multiplier matrix.
I was able to obtain a solution. To make sure it corresponds to a minimum, I think I should look at the Hessian. Since the constraint $PS = I$ is linear in $P$, the Hessian of the Lagrangian is independent of the Lagrange multiplier $\Lambda$. So, is it enough to show that the Hessian of the objective function is positive definite, as in unconstrained optimization?
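For concreteness, here is a small numeric sketch of the kind of check I have in mind: compute the Hessian of the objective at a candidate point, restrict it to the feasible directions (the null space of the constraint Jacobian), and look at the eigenvalues. The sizes, the random $W$ and $S$, and the candidate $P = (S^\top W^{-1} S)^{-1} S^\top W^{-1}$ (which satisfies $PS = I$) are my own illustrative choices, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 2, 4  # small illustrative sizes (my choice, not from the problem)

# Known data: W positive definite, S of full column rank m.
A = rng.standard_normal((n, n))
W = A @ A.T + n * np.eye(n)
S = rng.standard_normal((n, m))

# An illustrative feasible candidate: P = (S^T W^{-1} S)^{-1} S^T W^{-1},
# which satisfies P S = I (again, my stand-in for "a solution").
Winv = np.linalg.inv(W)
P = np.linalg.solve(S.T @ Winv @ S, S.T @ Winv)

def f(pvec):
    """Objective 0.5 * log det(P W P^T), with P passed as a flat vector."""
    Pm = pvec.reshape(m, n)
    return 0.5 * np.linalg.slogdet(Pm @ W @ Pm.T)[1]

# Hessian of the objective by central finite differences (it equals the
# Hessian of the Lagrangian here, since the constraint P S = I is linear).
x, d, h = P.ravel(), m * n, 1e-4
H = np.zeros((d, d))
for i in range(d):
    for j in range(d):
        ei, ej = np.zeros(d), np.zeros(d)
        ei[i], ej[j] = h, h
        H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                   - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)

# Jacobian of the m^2 linear constraints vec(P S - I) = 0, under
# row-major vectorization (numpy's ravel): J = I_m kron S^T.
J = np.kron(np.eye(m), S.T)            # shape (m*m, m*n)
_, sv, Vt = np.linalg.svd(J)
Z = Vt[np.sum(sv > 1e-10):].T          # basis of the null space of J

# Reduced (projected) Hessian on the feasible directions.
eigs = np.linalg.eigvalsh(Z.T @ H @ Z)
print(eigs.min() > 0)                  # True: positive definite on the null space
```

With this candidate the reduced Hessian $Z^\top H Z$ comes out positive definite, which is the second-order sufficient condition; positive definiteness of the full $H$ on all of $\mathbb{R}^{mn}$ is more than what is needed.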
Edit:
It seems like I have to work with the "bordered Hessian" matrix. Does anyone know how many leading principal minors I should evaluate here, and what sign conditions tell me whether the critical point I obtained is a minimum?
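To fix notation (using column-major vectorization, my convention): writing $p = \operatorname{vec}(P) \in \mathbb{R}^{mn}$ and the $m^2$ linear constraints as $g(p) = \operatorname{vec}(PS - I) = 0$, the constraint Jacobian is $J = S^\top \otimes I_m$ (from $\operatorname{vec}(PS) = (S^\top \otimes I_m)\operatorname{vec}(P)$), and the bordered Hessian I have in mind is
\begin{align*} \bar{H} = \begin{pmatrix} 0 & J \\ J^\top & H \end{pmatrix}, \end{align*} where $H$ is the $mn \times mn$ Hessian of the objective (equal to the Hessian of the Lagrangian here, since the constraints are linear).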