I'm trying to understand the formulas in "The Matrix Cookbook", but I'm having a hard time pinning down their intended meaning. For example, what does $\partial A$ mean exactly? How is $\partial$ defined as an operator? Does it mean that $A$ is thought of as a matrix whose entries change with respect to some variable $t$, or is there another interpretation?
And here's an almost philosophical question: what good does differentiating matrices do, exactly? I would be very happy to see a few examples of the kinds of results about matrices that can be proven by differentiation.
Also, is this in any way related to the Fréchet definition of a derivative? Here's the definition I'm referring to:
Let's say $(X,\|\cdot\|_X)$ and $(Y,\|\cdot\|_Y)$ are two Banach spaces and $f: X\to Y$ is a function. We define $D_pf$ to be the operator in $\mathcal{L}(X,Y)$ (if one exists) such that
$$\lim_{h\to 0}\frac{\|f(p+h)-f(p)-D_pf(h)\|_Y}{\|h\|_X}=0$$
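As a concrete finite-dimensional illustration of this definition (my own sketch, not taken from the cookbook or the question): for $f(X) = X^2$ on the space of $n\times n$ matrices, the candidate Fréchet derivative at $p$ is the linear map $H \mapsto pH + Hp$, and one can watch the ratio in the limit above shrink numerically.

```python
import numpy as np

rng = np.random.default_rng(0)

# f(X) = X @ X on the space of n x n matrices.
def f(X):
    return X @ X

# Candidate Frechet derivative at p: the linear map H -> pH + Hp.
def Df(p, H):
    return p @ H + H @ p

n = 4
p = rng.standard_normal((n, n))

# Check the defining limit: the remainder f(p+H) - f(p) - Df(p)(H)
# should shrink faster than ||H|| as H -> 0.
for scale in [1e-2, 1e-4, 1e-6]:
    H = scale * rng.standard_normal((n, n))
    remainder = np.linalg.norm(f(p + H) - f(p) - Df(p, H))
    ratio = remainder / np.linalg.norm(H)
    print(f"||H|| ~ {scale:.0e}: remainder/||H|| = {ratio:.2e}")
```

The ratio shrinks roughly linearly with $\|H\|$, as the definition demands: here the remainder is exactly $H^2$, whose norm is at most $\|H\|^2$.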
Are these notions related? If not, how do we know when to use which?
From what I can tell, the following conventions are used:

- $\partial A$ denotes the matrix of differentials of the entries, $(\partial A)_{ij} = \partial A_{ij}$; that is, $\partial$ acts entrywise.
- The derivative of a matrix by a scalar, $\frac{\partial Y}{\partial x}$, is taken entrywise: $\left(\frac{\partial Y}{\partial x}\right)_{ij} = \frac{\partial Y_{ij}}{\partial x}$.
- The derivative of a scalar by a matrix, $\frac{\partial y}{\partial X}$, is the matrix of partials: $\left(\frac{\partial y}{\partial X}\right)_{ij} = \frac{\partial y}{\partial X_{ij}}$.

The cookbook never puts matrices on both the top and the bottom, preferring to switch to index notation when that would be required.
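To make the scalar-by-matrix convention concrete, here is a small finite-difference check (my own sketch, with made-up data) of the standard identity $\frac{\partial}{\partial X}\operatorname{tr}(AX) = A^T$, where each entry of the gradient matrix is the partial derivative with respect to the matching entry of $X$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar function of a matrix: f(X) = tr(A X).
# Under the entrywise convention, (df/dX)_{ij} = df/dX_{ij},
# and the standard result is df/dX = A^T.
n = 3
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

def f(X):
    return np.trace(A @ X)

analytic = A.T

# Central finite-difference check of each entry of the gradient matrix.
eps = 1e-6
numeric = np.zeros_like(X)
for i in range(n):
    for j in range(n):
        E = np.zeros_like(X)
        E[i, j] = eps
        numeric[i, j] = (f(X + E) - f(X - E)) / (2 * eps)

print(np.max(np.abs(numeric - analytic)))
```

Since $f$ is linear in $X$, the central differences agree with $A^T$ up to floating-point rounding.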
As for why we would want to, well, why wouldn't we want to? Matrices can describe vectors and (rank-2) tensors, and vector and tensor calculus certainly have no end of uses.
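One standard payoff, sketched here with hypothetical data: the matrix-calculus identity $\nabla_x \|Ax-b\|^2 = 2A^T(Ax-b)$ is what lets you derive the normal equations for least squares by setting the gradient to zero.

```python
import numpy as np

rng = np.random.default_rng(2)

# f(x) = ||A x - b||^2; matrix calculus gives grad f = 2 A^T (A x - b).
m, n = 6, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
x = rng.standard_normal(n)

def f(x):
    r = A @ x - b
    return r @ r

grad = 2 * A.T @ (A @ x - b)

# Central finite-difference check, one coordinate direction at a time.
eps = 1e-6
numeric = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(n)
])

print(np.max(np.abs(numeric - grad)))
```

Setting the gradient to zero yields $A^TAx = A^Tb$, the normal equations; this is a typical example of a matrix result obtained by differentiation.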
And no, the Fréchet derivative is not really relevant here: the cookbook's derivatives are arrays of partial derivatives, organized entrywise for bookkeeping, rather than linear maps between Banach spaces. (In finite dimensions the two pictures can be translated into one another, but the cookbook never takes that point of view.)