Question about analogies between vector calculus and matrix calculus


If, working in $\mathbb{R}^N$, I have a set of $N$ functions $u_i = u_i(x_1,x_2,\dots,x_N)$ such that the correspondence between $(x_1,x_2,\dots,x_N)$ and $(u_1,u_2,\dots,u_N)$ is one-to-one, then I can build two interesting sets of basis vectors ($\mathbf{e}_i = \frac{\partial \mathbf{r}}{\partial u_i}$ and $\mathbf{e}^i = \nabla u_i$), which have the property of being reciprocal ($\mathbf{e}_i \cdot \mathbf{e}^j = \delta_{ij}$). Any vector $\mathbf{A}$ can then be written as $\mathbf{A}=A^i \mathbf{e}_i$ or $\mathbf{A}=A_i \mathbf{e}^i$, where $A^i = \mathbf{A} \cdot \mathbf{e}^i$ and $A_i = \mathbf{A} \cdot \mathbf{e}_i$, and other results follow.
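To make the setup concrete, here is a minimal numerical sketch of the reciprocal bases in a hypothetical example (plane polar coordinates $u_1 = r$, $u_2 = \theta$ in $\mathbb{R}^2$); the point evaluated and the vector $\mathbf{A}$ are arbitrary choices, not from any particular source:

```python
import numpy as np

# Plane polar coordinates: x = r cos(theta), y = r sin(theta).
r, theta = 2.0, 0.7

# Covariant basis e_i = d r_vec / d u_i (columns of the Jacobian dx/du).
e1 = np.array([np.cos(theta), np.sin(theta)])         # d(x,y)/dr
e2 = np.array([-r * np.sin(theta), r * np.cos(theta)])  # d(x,y)/dtheta
E_cov = np.column_stack([e1, e2])

# Reciprocal (contravariant) basis e^i = grad u_i: the rows of du/dx,
# i.e. of the inverse Jacobian. Stored here as columns of E_con.
E_con = np.linalg.inv(E_cov).T

# Reciprocity: e_i . e^j = delta_ij.
print(np.allclose(E_cov.T @ E_con, np.eye(2)))  # True

# Decompose a generic vector A: contravariant components A^i = A . e^i,
# and A = A^i e_i reconstructs it.
A = np.array([1.0, 3.0])
A_con = E_con.T @ A
print(np.allclose(E_cov @ A_con, A))  # True
```

The same pattern works for any invertible change of coordinates: the two bases are the columns of the Jacobian and the rows of its inverse.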

Question: does a similar pair of "natural" bases (linked in some way to the change of coordinates) exist for matrices too? Each basis should contain $N^2$ matrices, and each generic matrix should be expressible in exactly one way as a sum $A_{ij} \mathsf{B}_{ij}$, where the $A_{ij}$ are $N^2$ scalars ($i,j=1,2,\dots,N$) and the $\mathsf{B}_{ij}$ are the corresponding basis matrices. Perhaps it would be better to simply write $A_{i} \mathsf{B}_{i}$ ($i=1,2,\dots,N^2$)? The analogy with vectors is problematic, however, because of the absence of a dot product between matrices (how can I find the coordinates of a generic matrix in some basis, and what would a "reciprocal basis" mean in this context?). I suppose (but I'm not sure; maybe this question is meaningless) that there is a natural way to develop this line of thought, but I don't know it. Can you explain something to me and point me to some site or book? (Or even discourage me if this analogy is a dead end; your intervention would be equally useful.)
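For what it's worth, here is a numerical sketch of the uniqueness part of the question, under two assumptions not stated above: the basis matrices are taken at random (hence almost surely linearly independent), and the Frobenius inner product $\langle X, Y\rangle = \operatorname{tr}(X^{\mathsf T}Y)$ is used as one candidate matrix analogue of the dot product. It shows that a "reciprocal" set of matrices extracting the coefficients does exist for any basis of $N^2$ matrices:

```python
import numpy as np

# N = 2: a basis of N^2 = 4 random matrices (almost surely independent).
rng = np.random.default_rng(0)
N = 2
B = [rng.standard_normal((N, N)) for _ in range(N * N)]

# Vectorize: each basis matrix becomes a column of an N^2 x N^2 matrix M.
M = np.column_stack([b.ravel() for b in B])

# A generic matrix A has unique coefficients a_k with A = sum_k a_k B_k,
# found by solving the linear system M a = vec(A).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
a = np.linalg.solve(M, A.ravel())
print(np.allclose(sum(ak * bk for ak, bk in zip(a, B)), A))  # True

# The rows of M^{-1}, reshaped, act as a "reciprocal" basis {B^k}:
# a_k = <B^k, A> under the Frobenius inner product trace(X^T Y).
B_recip = [np.linalg.inv(M)[k].reshape(N, N) for k in range(N * N)]
a_frob = np.array([np.trace(bk.T @ A) for bk in B_recip])
print(np.allclose(a, a_frob))  # True
```

This only illustrates the linear-algebra side; whether a basis naturally tied to a change of coordinates exists, as in the vector case, is the open part of the question.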