Directional derivative in orthogonal directions of a determinant


I am trying to calculate the gradient $\nabla_\theta f \big|_{\theta_0}$ of the following function $$f(\theta) = \frac{1}{2} \log \det \left( Q^T(\theta) J^T(\theta) J(\theta) Q(\theta) \right)$$ where $$ f:\mathbb{R}^d \to \mathbb{R};~~~~~Q:\mathbb{R}^d \to \mathbb{R}^{d \times (d-t)}~(!!!);~~~~~J:\mathbb{R}^d \to \mathbb{R}^{d \times d}, $$ and $t < d$. One important detail that I am trying to leverage is that $Q(\theta)$ comes from the compact SVD of a projection matrix $P(\theta)$: the columns of $Q$ are the orthonormal eigenvectors of $P(\theta)$ associated with the non-zero eigenvalues.
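For concreteness, here is how I evaluate $f$ numerically (a sketch only: `Q_fn` and `J_fn` are stand-ins for whatever produces $Q(\theta)$ and $J(\theta)$ in my setting, and `slogdet` keeps the small Gram determinant numerically stable). Note that right-multiplying $Q$ by any orthogonal $U$ leaves the value unchanged, since $\det(U^T M U) = \det(M)$:

```python
import numpy as np

def f(theta, Q_fn, J_fn):
    """f(theta) = 0.5 * log det( Q(theta)^T J(theta)^T J(theta) Q(theta) ).

    Q_fn and J_fn are placeholders: Q_fn(theta) must return a d x (d-t)
    matrix with orthonormal columns, J_fn(theta) a d x d matrix.
    """
    Q, J = Q_fn(theta), J_fn(theta)
    JQ = J @ Q                                  # d x (d-t)
    # slogdet of the (d-t) x (d-t) Gram matrix avoids overflow/underflow
    sign, logabsdet = np.linalg.slogdet(JQ.T @ JQ)
    return 0.5 * logabsdet
```

Swapping `Q_fn` for `lambda th: Q_fn(th) @ U` with any orthogonal `U` returns the same value, which is the basis-invariance noted above.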

I would very much like this calculation to sidestep a direct differential of $Q(\theta)$: since the compact SVD is non-unique, I do not think the map marked (!!!) above is even a well-defined function. With that in mind, I am considering the directional derivatives

$$ \nabla_\theta f = \begin{pmatrix} D_{q_1} f & \cdots & D_{q_d} f \end{pmatrix} \begin{pmatrix} Q^T(\theta_0) \\ Q_\perp^T (\theta_0) \end{pmatrix}, $$

where the $q_i$ are the columns of the "completed" $d \times d$ orthogonal matrix $\begin{pmatrix} Q(\theta_0) & Q_\perp (\theta_0) \end{pmatrix}$. My intuition is that $D_{q_i} f = 0$ whenever $q_i$ is a column of $Q_\perp(\theta_0)$. That is, going in a direction orthogonal to all columns of $Q(\theta_0)$ should yield no change in $f$.
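As a numerical sanity check of this intuition (a sketch only, not a proof: the family $P(\theta)$, $J(\theta)$ below is a made-up toy with $d = 3$, $t = 1$, and $f$ is computed basis-free from the $d-t$ non-zero eigenvalues of $P J^T J P$, which coincide with the eigenvalues of $Q^T J^T J Q$), one can finite-difference $f$ along the columns of $\begin{pmatrix} Q(\theta_0) & Q_\perp (\theta_0) \end{pmatrix}$:

```python
import numpy as np

def f_of_theta(theta):
    # Made-up toy family (an assumption, purely for illustration):
    # P(theta) is the orthogonal projection onto the plane normal to
    # u(theta) = theta + e3, and J(theta) is a smooth d x d family.
    d, t = 3, 1
    u = theta + np.array([0.0, 0.0, 1.0])
    P = np.eye(d) - np.outer(u, u) / (u @ u)
    J = np.diag([1.0, 2.0, 3.0]) + 0.2 * np.outer(np.ones(d), theta)
    A = J.T @ J
    # f = 0.5 * sum(log of the d-t non-zero eigenvalues of P A P),
    # i.e. 0.5 * log det(Q^T A Q) for ANY orthonormal basis Q of range(P)
    eigs = np.sort(np.linalg.eigvalsh(P @ A @ P))[t:]
    return 0.5 * np.sum(np.log(eigs))

def directional_derivative(f, theta0, v, h=1e-5):
    # central finite difference along the unit direction v
    return (f(theta0 + h * v) - f(theta0 - h * v)) / (2.0 * h)

theta0 = np.zeros(3)
# At theta0: range(Q(theta0)) = span{e1, e2}, and Q_perp(theta0) = e3
for i, v in enumerate(np.eye(3)):
    print(f"D along e{i + 1}: {directional_derivative(f_of_theta, theta0, v):+.6f}")
```

In this particular toy the derivative along $e_3 = Q_\perp(\theta_0)$ does vanish while the in-range derivatives do not, but that follows from how the toy was built; it is the kind of check, not the result, that carries over to the general setting.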

Any thoughts or suggested readings would be appreciated. I've tried a lot of (incorrect) ways to prove my conjecture.