Prove that the derivative of a determinant is a sum of determinants


Given $n^2$ functions $f_{ij}$, each differentiable on an interval $(a,b)$, define $F(x) = \det[f_{ij}(x)]$ for each $x$ in $(a,b)$. Prove that the derivative $F'(x)$ is a sum of determinants, $$ F'(x) = \sum_{i=1}^n \det A_i(x),$$ where $A_i(x)$ is the matrix obtained by differentiating the functions in the $i$th row of $[f_{ij}(x)]$.

Yeah, I have no clue what I'm supposed to do.



From Differentiating the determinant of the Jacobian of a diffeomorphism (don't understand a proof) we get Jacobi's formula $$ \frac{d}{dt} \det(G(t)) = \det(G(t)) \operatorname{trace}[G(t)^{-1} G'(t)], $$ where $G(t) = [f_{ij}(t)]$ and $t = x$.
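Jacobi's formula above can be spot-checked numerically. The sketch below (the matrix entries, the evaluation point, and the step size are my own choices, not from the question) compares a central finite difference of $\det G(t)$ against $\det(G(t))\,\operatorname{trace}[G(t)^{-1}G'(t)]$:

```python
import numpy as np

# A small, explicitly differentiable matrix family G(t) (hypothetical example)
def G(t):
    return np.array([[1 + t,       t**2    ],
                     [np.sin(t),   2 + t**3]])

def Gprime(t):
    # Entrywise derivative of G(t)
    return np.array([[1.0,        2 * t    ],
                     [np.cos(t),  3 * t**2]])

t, h = 0.7, 1e-6
# Left side: central finite difference of det(G(t))
lhs = (np.linalg.det(G(t + h)) - np.linalg.det(G(t - h))) / (2 * h)
# Right side: Jacobi's formula (valid here since G(t) is invertible)
rhs = np.linalg.det(G(t)) * np.trace(np.linalg.inv(G(t)) @ Gprime(t))
assert abs(lhs - rhs) < 1e-5
```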

Now $\det(G(t)) \, G(t)^{-1} = H(t)^T$, where $H(t)$ is the matrix of cofactors of $G(t)$: http://en.wikipedia.org/wiki/Adjugate_matrix. So $$ \frac{d}{dt} \det(G(t)) = \sum_{i=1}^n \sum_{j=1}^n H_{ij}(t) G'_{ij}(t) . $$ But by the Laplace expansion along the $i$th row (http://en.wikipedia.org/wiki/Determinant#Laplace.27s_formula_and_the_adjugate_matrix), and since the cofactors of $A_i(t)$ along row $i$ do not involve row $i$ and hence agree with those of $G(t)$, we get $\sum_{j=1}^n H_{ij}(t) G'_{ij}(t) = \det(A_i(t))$, so the double sum equals $\sum_{i=1}^n \det(A_i(t))$.
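The cofactor identity can likewise be checked numerically. In this sketch (random matrices standing in for $G(t)$ and $G'(t)$ at one fixed $t$; the names and seed are mine), row $i$ of $G$ is replaced by row $i$ of $G'$ to form $A_i$, and the full double sum is compared with $\sum_i \det A_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
G = rng.standard_normal((n, n))   # plays the role of G(t) at one fixed t
Gp = rng.standard_normal((n, n))  # plays the role of G'(t) at the same t

# Cofactor matrix: H = det(G) * (G^{-1})^T, valid since G is invertible
H = np.linalg.det(G) * np.linalg.inv(G).T

def A(i):
    """Copy of G with row i replaced by the corresponding row of G'."""
    Ai = G.copy()
    Ai[i, :] = Gp[i, :]
    return Ai

# Laplace expansion along row i: sum_j H_ij * G'_ij = det(A_i)
for i in range(n):
    assert np.isclose(H[i, :] @ Gp[i, :], np.linalg.det(A(i)))

# Hence the double sum equals the sum of the determinants det(A_i)
double_sum = float(np.sum(H * Gp))
sum_dets = sum(np.linalg.det(A(i)) for i in range(n))
assert np.isclose(double_sum, sum_dets)
```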

This proof works if $G(t)$ is invertible. If $G(t)$ is not invertible, I think you can follow the second proof I have given here: Metric on an open subset of $\mathbb{R}^d$ and Christoffel symbol of the second kind


$$\begin{align}
F'(x) & = \frac{d}{dx} \det[f_{i,j}(x)] & \text{apply the Leibniz formula for the determinant} \\
& = \frac{d}{dx} \sum_{\sigma \in S_n} (\operatorname{sgn}\sigma) \prod_{k=1}^{n} f_{k,\sigma_k}(x) \\
& = \sum_{\sigma \in S_n} (\operatorname{sgn}\sigma) \, \frac{d}{dx} \prod_{k=1}^{n} f_{k,\sigma_k}(x) & \text{apply the product rule} \\
& = \sum_{\sigma \in S_n} (\operatorname{sgn}\sigma) \sum_{i=1}^{n} \left( \prod_{k=1}^{n} \begin{cases} f_{k,\sigma_k}(x) & \text{if } k \ne i \\ f'_{k,\sigma_k}(x) & \text{if } k = i \end{cases} \right) \\
& = \sum_{i=1}^{n} \sum_{\sigma \in S_n} (\operatorname{sgn}\sigma) \left( \prod_{k=1}^{n} \begin{cases} f_{k,\sigma_k}(x) & \text{if } k \ne i \\ f'_{k,\sigma_k}(x) & \text{if } k = i \end{cases} \right) \\
& = \sum_{i=1}^{n} \det\left[ \begin{cases} f_{k,j}(x) & \text{if } k \ne i \\ f'_{k,j}(x) & \text{if } k = i \end{cases} \right] \\
& = \sum_{i=1}^{n} \det A_i(x)
\end{align}$$
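This Leibniz-formula derivation can be verified symbolically for a small $n$. The sketch below (using SymPy, with abstract function entries $f_{ij}(x)$; the names are mine) checks that $\frac{d}{dx}\det[f_{ij}(x)] = \sum_i \det A_i(x)$ holds identically for $n = 2$:

```python
import sympy as sp

x = sp.symbols('x')
n = 2
# Generic differentiable entries f_ij(x) as undetermined functions
f = [[sp.Function(f'f{i}{j}')(x) for j in range(n)] for i in range(n)]
F = sp.Matrix(f)

# Left side: derivative of the determinant
lhs = sp.diff(F.det(), x)

# Right side: sum of determinants of A_i (row i differentiated)
rhs = sp.S.Zero
for i in range(n):
    Ai = F.copy()
    for j in range(n):
        Ai[i, j] = sp.diff(F[i, j], x)
    rhs += Ai.det()

assert sp.simplify(lhs - rhs) == 0
```

The same loop works for larger $n$; the symbolic check just grows with $n!$ terms, mirroring the Leibniz sum in the derivation above.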