Derivative of a distance function between SPD matrices


I have two symmetric positive-definite (SPD) matrices $\mathbf{A}, \mathbf{B} \in \mathbb{R}^{N\times N}$. $\mathbf{A}$ is known, and $\mathbf{B}$ is a function of $\mathbf{x} \in \mathbb{R}^N$. Define the distance between $\mathbf{A}$ and $\mathbf{B}$ as follows:

$D(\mathbf{A}, \mathbf{B}) = \lVert \ln (\mathbf{A}^{-1} \mathbf{B})\rVert_{F}^2 = \sum_{n=1}^{N} [\ln(w_n)]^2$

with $\lVert \cdot \rVert_F$ the Frobenius norm and $w_1, \cdots, w_N$ the eigenvalues of $\mathbf{A}^{-1} \mathbf{B}$ (these are real and positive, since $\mathbf{A}^{-1} \mathbf{B}$ is similar to the SPD matrix $\mathbf{A}^{-1/2} \mathbf{B} \mathbf{A}^{-1/2}$, so the logarithms are well defined).
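As a numerical sanity check, the distance can be evaluated directly from the eigenvalues of $\mathbf{A}^{-1}\mathbf{B}$. A minimal NumPy sketch (the function name `spd_log_distance_sq` and the random test matrices are my own, for illustration):

```python
import numpy as np

def spd_log_distance_sq(A, B):
    """D(A, B) = sum_n [ln(w_n)]^2, with w_n the eigenvalues of A^{-1} B."""
    # A^{-1} B is similar to the SPD matrix A^{-1/2} B A^{-1/2}, so its
    # eigenvalues are real and positive; .real drops numerical round-off.
    w = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return float(np.sum(np.log(w) ** 2))

rng = np.random.default_rng(0)
N = 4
X = rng.standard_normal((N, N)); A = X @ X.T + N * np.eye(N)  # random SPD
Y = rng.standard_normal((N, N)); B = Y @ Y.T + N * np.eye(N)  # random SPD

# D(A, A) = 0, and D is symmetric: the eigenvalues of B^{-1} A are the
# reciprocals of those of A^{-1} B, and [ln w]^2 = [ln(1/w)]^2.
assert abs(spd_log_distance_sq(A, A)) < 1e-10
assert abs(spd_log_distance_sq(A, B) - spd_log_distance_sq(B, A)) < 1e-8
```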

Can we obtain the gradient of $D(\mathbf{A}, \mathbf{B})$ with respect to $\mathbf{x}$ if we know the expression of $\frac{\partial \mathbf{B}}{\partial \mathbf{x}}$? (For the $n$-th component of $\mathbf{x}$, $\frac{\partial \mathbf{B}}{\partial x_n}$ is a matrix in $\mathbb{R}^{N\times N}$.)


How can the chain rule be applied in this problem? For example, if I know the expressions of $\frac{\partial D}{\partial \mathbf{B}} \in \mathbb{R}^{N\times N}$ and $\frac{\partial \mathbf{B}}{\partial x_n} \in \mathbb{R}^{N\times N}$, can I obtain the expression of $\frac{\partial D}{\partial x_n} \in \mathbb{R}$?
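For a scalar function of a matrix, the chain rule contracts the two quantities through a trace: $\frac{\partial D}{\partial x_n} = \operatorname{tr}\!\big( (\frac{\partial D}{\partial \mathbf{B}})^T \frac{\partial \mathbf{B}}{\partial x_n} \big)$. The sketch below checks this numerically using the identity $d\,\operatorname{tr} f(\mathbf{M}) = \operatorname{tr}(f'(\mathbf{M})\, d\mathbf{M})$ with $f(z) = (\ln z)^2$ and $\mathbf{M} = \mathbf{A}^{-1}\mathbf{B}$, which gives $(\frac{\partial D}{\partial \mathbf{B}})^T = 2 \ln(\mathbf{A}^{-1}\mathbf{B})\, \mathbf{B}^{-1}$. The parametrization $\mathbf{B}(\mathbf{x}) = \mathbf{B}_0 + \operatorname{diag}(\mathbf{x})$ (so $\frac{\partial \mathbf{B}}{\partial x_n} = \mathbf{e}_n \mathbf{e}_n^T$) is a hypothetical choice for the test only; verify the gradient expression for your own setting:

```python
import numpy as np

def D(A, B):
    w = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return float(np.sum(np.log(w) ** 2))

def logm_pos(M):
    # Matrix log of M with real positive eigenvalues (M is similar to an
    # SPD matrix, hence diagonalizable): M = V diag(w) V^{-1}.
    w, V = np.linalg.eig(M)
    return (V @ np.diag(np.log(w.real)) @ np.linalg.inv(V)).real

def dD_dB_T(A, B):
    # From d tr f(M) = tr(f'(M) dM), f(z) = (ln z)^2, M = A^{-1} B:
    #   dD = 2 tr( log(A^{-1}B) B^{-1} dB ),  so  (dD/dB)^T = 2 log(A^{-1}B) B^{-1}
    return 2.0 * logm_pos(np.linalg.solve(A, B)) @ np.linalg.inv(B)

rng = np.random.default_rng(1)
N = 4
X = rng.standard_normal((N, N)); A  = X @ X.T + N * np.eye(N)
Y = rng.standard_normal((N, N)); B0 = Y @ Y.T + N * np.eye(N)

# Hypothetical parametrization for the check: B(x) = B0 + diag(x),
# so dB/dx_n = e_n e_n^T (any smooth SPD-valued B(x) contracts the same way).
x = 0.1 * rng.standard_normal(N)
Bx = B0 + np.diag(x)
E = np.eye(N)

GT = dD_dB_T(A, Bx)
# Chain rule: dD/dx_n = tr( (dD/dB)^T dB/dx_n ); here it reduces to GT[n, n].
grad = np.array([np.trace(GT @ np.outer(E[n], E[n])) for n in range(N)])

# Central finite-difference check of the same gradient.
eps = 1e-6
fd = np.array([(D(A, B0 + np.diag(x + eps * E[n]))
              - D(A, B0 + np.diag(x - eps * E[n]))) / (2 * eps) for n in range(N)])
assert np.allclose(grad, fd, atol=1e-5)
```

The trace form works for any smooth $\mathbf{B}(\mathbf{x})$, since $\frac{\partial D}{\partial x_n} = \sum_{i,j} \frac{\partial D}{\partial B_{ij}} \frac{\partial B_{ij}}{\partial x_n}$ is exactly that contraction written entrywise.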