product rule for matrix functions


If $A,B$ are matrices, does the product rule hold as I've written it? (It holds if $\vec{x}$ were a scalar.) $$\frac{d A(\vec{x})B(\vec{x})}{d\vec{x}} = \frac{d A(\vec{x})}{d\vec{x}} B(\vec{x}) + A(\vec{x})\frac{d B(\vec{x})}{d\vec{x}} $$

I do not know what the derivative $\frac{d A(\vec{x})}{d\vec{x}}$ even is, so it is difficult to verify the statement directly. But in my case, if the identity holds, terms cancel and I never need to compute that derivative explicitly.

Extra: Are there sources that present this material in a not-so-formal manner? (I am an engineer.)

Best answer:

Unfortunately, the proposed rule is false, but you can use index notation to derive the correct rule $$\eqalign{ \def\LR#1{\left(#1\right)} \def\p{\partial} \def\grad#1#2{\frac{d #1}{d #2}} \def\gradLR#1#2{\LR{\grad{#1}{#2}}} \def\op#1{\operatorname{#1}} \def\vecc#1{\op{vec}\LR{#1}} P &= AB \\ P_{ij} &= \sum_n A_{in}B_{nj} \\ \grad{P_{ij}}{x_k} &= \sum_n \gradLR{A_{in}}{x_k} B_{n j} + A_{in} \gradLR{B_{nj}}{x_k} \\ }$$ This result is a third-order tensor which cannot be written using standard matrix notation. The first term on the RHS is especially problematic.
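The tensor rule above can be checked numerically. The sketch below (my own illustration, with arbitrarily chosen matrix functions `A(x)` and `B(x)`) compares the finite-difference derivative of the product $P=AB$ against the two-term index-notation formula, contracting over $n$ with `einsum`:

```python
import numpy as np

# Hypothetical 2x2 matrix functions of a 3-dimensional vector x,
# chosen only to exercise the rule; any smooth choices would do.
def A(x):
    return np.array([[x[0] * x[1], x[2]],
                     [np.sin(x[0]), x[1] ** 2]])

def B(x):
    return np.array([[x[2] ** 2, x[0]],
                     [x[1], np.cos(x[2])]])

def dF(F, x, h=1e-6):
    """Third-order tensor dF_ij/dx_k via central finite differences."""
    T = np.empty(F(x).shape + (x.size,))
    for k in range(x.size):
        e = np.zeros_like(x)
        e[k] = h
        T[..., k] = (F(x + e) - F(x - e)) / (2 * h)
    return T

x = np.array([0.3, -1.2, 0.7])

# Left-hand side: dP_ij/dx_k for P = A @ B.
dP = dF(lambda y: A(y) @ B(y), x)

# Right-hand side: sum_n (dA_in/dx_k) B_nj + A_in (dB_nj/dx_k).
rhs = (np.einsum('ink,nj->ijk', dF(A, x), B(x))
       + np.einsum('in,njk->ijk', A(x), dF(B, x)))

print(np.allclose(dP, rhs, atol=1e-6))  # True: the tensor rule holds
```

Note that `dP` has shape `(2, 2, 3)`: it is genuinely a third-order object, which is why it resists ordinary matrix notation.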

One trick is to vectorize the equation before differentiating. Writing $a=\vecc{A}$ and $b=\vecc{B}$, $$\eqalign{ p &= \vecc{P} \\ &= \LR{B\otimes I}^Ta \;=\; \LR{I\otimes A}b \\ \grad px &= \LR{B\otimes I}^T\gradLR ax \;+\; \LR{I\otimes A}\gradLR bx \\ }$$ Vectorization flattens matrices into vectors, so standard matrix notation works again.
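The two Kronecker-product identities behind this step, $\operatorname{vec}(AB) = (B\otimes I)^T\operatorname{vec}(A) = (I\otimes A)\operatorname{vec}(B)$, can be verified directly. A minimal sketch with random matrices (note that `vec` must flatten column-major for these identities to hold):

```python
import numpy as np

def vec(M):
    """Column-major (Fortran-order) vectorization, the convention
    used in the Kronecker-product identities."""
    return M.reshape(-1, order='F')

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # arbitrary shapes: A is m x n,
B = rng.standard_normal((4, 2))   # B is n x p, so AB is m x p
I_m = np.eye(3)                   # identity of size m
I_p = np.eye(2)                   # identity of size p

p = vec(A @ B)
print(np.allclose(p, np.kron(B, I_m).T @ vec(A)))  # True
print(np.allclose(p, np.kron(I_p, A) @ vec(B)))    # True
```

With these identities in hand, the displayed derivative formula is just the scalar-style product rule applied to the (now ordinary) matrix-vector products $(B\otimes I)^T a$ and $(I\otimes A) b$.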

To learn more about this subject, explore the matrix-calculus tag on this site.
The Matrix Cookbook is also a handy reference.