For my research, I need to calculate the derivative of a scalar determinant function with respect to a matrix that is itself a function of a vector.
Consider the scalar $c = \sqrt{\det(\mathbf{A})}$, where
$\mathbf{A}=\mathbf{J}\mathbf{J}^T$
and $\mathbf{J}$ is an $m \times n$ matrix with $n \ge m$.
Furthermore, $\mathbf{J}(\vec{\theta})$ is a function of the vector $\vec{\theta}\in\Bbb{R}^k$.
How can I get the derivative $\frac{\partial{c}}{\partial{\vec{\theta}}}$ ?
So far, I have calculated it based on the Matrix Cookbook: $$\frac{\partial{c}}{\partial{\vec{\theta}}}= \frac{\partial{c}}{\partial{\det(\mathbf{A})}} \frac{\partial{\det(\mathbf{A})}}{\partial{\mathbf{A}}} \frac{\partial{\mathbf{A}}}{\partial{\vec{\theta}}}= \frac{1}{\sqrt{\det(\mathbf{A})}} \det(\mathbf{A})(\mathbf{A}^{-1})^T \frac{\partial{\mathbf{A}}}{\partial{\vec{\theta}}}$$
The key point I can't get: $\frac{\partial{\mathbf{A}}}{\partial{\vec{\theta}}}$ is a tensor of order 3, right?
How can it be multiplied on the left by a scalar and a matrix to produce the vector $\frac{\partial{c}}{\partial{\vec{\theta}}}$?
I also tried vectorizing $\mathbf{A}$, but that raises another question: how do I multiply a matrix on the left with the vectorized $\frac{\partial{\mathbf{A}}}{\partial{\vec{\theta}}}$?
Please help me figure out these questions.
Any answer or suggestion would be much appreciated.
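To make the shapes concrete, here is a small NumPy sketch (the linear map `W` and the sizes $m=2$, $n=3$, $k=2$ are arbitrary assumptions for illustration) showing that $\frac{\partial\mathbf{A}}{\partial\vec{\theta}}$ really is a third-order object:

```python
import numpy as np

# Toy sizes, assumed only for illustration: J is m x n, theta is in R^k.
m, n, k = 2, 3, 2
rng = np.random.default_rng(0)
W = rng.standard_normal((m * n, k))       # hypothetical linear map: vec(J) = W @ theta

def J(theta):
    # column-major reshape matches the usual vec() convention
    return (W @ theta).reshape(m, n, order='F')

def A_of(theta):
    Jm = J(theta)
    return Jm @ Jm.T                      # A = J J^T, an m x m matrix

theta = rng.standard_normal(k)
c = np.sqrt(np.linalg.det(A_of(theta)))   # the scalar c = sqrt(det(A))

# dA/dtheta, stacked along a third axis via central differences
h, E = 1e-6, np.eye(k)
dA = np.stack([(A_of(theta + h * E[i]) - A_of(theta - h * E[i])) / (2 * h)
               for i in range(k)], axis=-1)
print(dA.shape)   # (2, 2, 2): one m x m slice per component of theta
```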
$ \def\a{\alpha}\def\t{\theta} \def\o{{\tt1}}\def\p{\partial}\def\j{\jmath} \def\L{\left}\def\R{\right} \def\LR#1{\L(#1\R)} \def\vecc#1{\operatorname{vec}\LR{#1}} \def\sym#1{\operatorname{sym}\LR{#1}} \def\trace#1{\operatorname{Tr}\LR{#1}} \def\qiq{\quad\implies\quad} \def\grad#1#2{\frac{\p #1}{\p #2}} \def\c#1{\color{red}{#1}} $First, let's adopt a convention in which an uppercase letter denotes a matrix, a lowercase letter a vector, and a Greek letter a scalar, and rename the problem variables accordingly: $\;\{c,\t\} \to \{\a,x\}$.
Second, let's introduce some new variables $$\eqalign{ j &= \vecc{J} \\ B &= \a A^{-1} J &\qiq \;\;\;b\; &= \vecc{B} = \a\LR{I_n\otimes A^{-1}} j \\ M &= \grad{j}{x} &\qiq \:dj &= M\,dx \\ A &= JJ^T &\qiq dA &= \LR{dJ\,J^T+J\,dJ^T} \\ }$$ Finally, the Frobenius product is a convenient notation for the trace $$\eqalign{ A:G &= \sum_{i=1}^m\sum_{k=1}^n A_{ik}G_{ik} \;=\; \trace{A^TG} \\ A:A &= \big\|A\big\|^2_F \\ }$$ The properties of the underlying trace function allow the terms in such a product to be rearranged in many different but equivalent ways, e.g. $$\eqalign{ A:G &= G:A \\ A:G &= A^T:G^T \\ H:\LR{AG} &= \LR{HG^T}:A = \LR{A^TH}:G \\ }$$ Square the function and calculate its differential, then perform a change of variables from $A\to J\to j\to x$, then recover the desired gradient. Note that $A$ is symmetric, so $A^{-T}=A^{-1}$, and the two terms of $dA$ contribute equal amounts, which cancels the factor of $\frac12$. $$\eqalign{ \a &= {\det(A)}^{1/2} \\ \a^2 &= \det(A) \\ 2\a\;d\a &= d\det(A) \\ &= \LR{\a^2A^{-T}}:dA \\ d\a &= \frac 12\LR{\a A^{-\o}}:\LR{dJ\,J^T+J\,dJ^T} \\ &= B:dJ \\ &= b:dj \\ &= b:\LR{M\,dx} \\ &= \LR{M^Tb}:dx \\ \grad{\a}{x} &= {M^Tb} \\ &= \det(A)^{1/2} \,\LR{\grad{\vecc{J}}{x}}^T \LR{I_n\otimes A^{-1}} \,\vecc{J} \\ }$$
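The final formula is easy to check numerically. Below is a sketch in NumPy: the toy $J(\theta)$ is an arbitrary test function (not from the question), $\operatorname{vec}$ is column-major to match the Kronecker identity $\vecc{A^{-1}J} = \LR{I_n\otimes A^{-1}}\vecc{J}$, and $M=\grad{\vecc{J}}{x}$ is built by finite differences.

```python
import numpy as np

m, n, k = 2, 3, 2

def J(theta):
    # arbitrary smooth test function J(theta), assumed for this check only
    t0, t1 = theta
    return np.array([[t0,         t1,         t0 * t1],
                     [np.sin(t0), np.cos(t1), t0 + t1]])

def vec(X):
    # column-stacking vec, consistent with vec(XYZ) = (Z^T kron X) vec(Y)
    return X.reshape(-1, order='F')

def alpha(theta):
    Jm = J(theta)
    return np.sqrt(np.linalg.det(Jm @ Jm.T))   # c = sqrt(det(J J^T))

theta = np.array([0.7, -0.3])
h = 1e-6

# M = d vec(J) / d theta, column-by-column via central differences
M = np.zeros((m * n, k))
for i in range(k):
    e = np.zeros(k); e[i] = h
    M[:, i] = (vec(J(theta + e)) - vec(J(theta - e))) / (2 * h)

Jm = J(theta)
A = Jm @ Jm.T
grad_formula = (np.sqrt(np.linalg.det(A))
                * M.T @ np.kron(np.eye(n), np.linalg.inv(A)) @ vec(Jm))

# Reference: central-difference gradient of alpha itself
grad_fd = np.array([(alpha(theta + h * np.eye(k)[i]) - alpha(theta - h * np.eye(k)[i])) / (2 * h)
                    for i in range(k)])

print(np.allclose(grad_formula, grad_fd, atol=1e-5))
```

The two gradients agree to finite-difference accuracy, confirming the derivation.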