How to prove or explain some matrix identities.

I'm writing a master's dissertation (so I have to prove, or cite a reference for, everything I do) in which I use some matrix identities that I don't know how to justify in a friendly way, nor do I know a reference that does so. These identities are not central to my work, and I would prefer to cite a reference that explains/proves them, but in any case I must state them in the text. Any help would be much appreciated; let me give two examples of what I'm talking about.

Consider any $C^2$ function $f:Mat(m \times n) \to \mathbf{R}$. I have to compute the Hessian of this function, and it has the following form: $$\frac{\partial^2{f}}{\partial{X_{ij}}\partial{X_{kl}}} = \text{vec}(E_{ij})^{\top}(A \otimes B)\text{vec}(E_{kl}),$$ where $\otimes$ is the Kronecker product, $A \in Mat(m \times m)$ and $B \in Mat(n \times n)$ are matrices whose precise form is not important for the question, $\text{vec}$ is the operator that vectorizes a matrix columnwise, i.e., stacks its columns one after the other into a single column vector, and $E_{pq} \in Mat(m \times n)$ is the matrix whose $(p,q)$ entry is $1$ and whose other entries are $0$. Now, the actual question: I need to use the fact that when I run over the indices $i, j, k, l$, in that order, I obtain the following matrix representation of the Hessian: $$\text{Hess}(f) = A \otimes B.$$
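Not a proof, but here is a quick numerical sanity check (a NumPy sketch, with the sizes $m$, $n$ and the matrices $A$, $B$ chosen arbitrarily) that makes the index bookkeeping concrete. The key point is that $\text{vec}(E_{ij})$ is the standard basis vector of $\mathbf{R}^{mn}$ with a $1$ in position $jm+i$ (zero-based), so placing the entry for the index pair $(i,j),(k,l)$ at row $jm+i$ and column $lm+k$ of the Hessian reproduces $A \otimes B$ entrywise:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4                      # arbitrary sizes for the check
A = rng.standard_normal((m, m))  # A is m x m
B = rng.standard_normal((n, n))  # B is n x n
K = np.kron(A, B)                # the claimed Hessian, mn x mn

def vec(M):
    """Column-major vectorization: stack the columns of M."""
    return M.reshape(-1, order="F")

H = np.zeros((m * n, m * n))
for j in range(n):
    for i in range(m):
        for l in range(n):
            for k in range(m):
                Eij = np.zeros((m, n)); Eij[i, j] = 1.0
                Ekl = np.zeros((m, n)); Ekl[k, l] = 1.0
                # vec(E_ij) is the standard basis vector with a 1 at j*m + i,
                # so this scalar is exactly the (j*m+i, l*m+k) entry of A ⊗ B.
                H[j * m + i, l * m + k] = vec(Eij) @ K @ vec(Ekl)

print(np.allclose(H, K))  # prints True: the assembled Hessian equals A ⊗ B
```

Note that the row/column positions follow the vec (column-major) ordering, i.e. within each pair the first index runs fastest; with any other ordering of the entries one obtains a permuted version of $A \otimes B$ instead.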

Another example is the following: given the expression $$C_{ij} := (e_j \otimes C)e_i,$$ where $e_i \in \mathbf{R}^n$, $e_j \in \mathbf{R}^m$ and $C \in Mat(d \times n)$, I want to show that if I run first over $i$ and then over $j$, stacking the vectors $C_{ij}$ horizontally, i.e., $[C_{11} \ C_{21} \ \dots \ C_{nm}] \in Mat(md \times mn)$, then the final result is $\text{Id}_m \otimes C$.
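This one can be checked numerically the same way (again only a NumPy sketch with arbitrarily chosen sizes $d$, $n$, $m$). Since $(e_j \otimes C)e_i = e_j \otimes (Ce_i)$, each $C_{ij}$ is the $j$-th "block slot" filled with the $i$-th column of $C$, and stacking them with $i$ running fastest reproduces $\text{Id}_m \otimes C$ column by column:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, m = 2, 3, 4                    # arbitrary sizes for the check
C = rng.standard_normal((d, n))      # C is d x n

cols = []
for j in range(m):                   # outer index j
    for i in range(n):               # inner index i runs fastest
        e_i = np.eye(n)[:, i]        # e_i in R^n
        e_j = np.eye(m)[:, [j]]      # e_j in R^m, kept as an m x 1 column
        # C_ij = (e_j ⊗ C) e_i, an md-vector equal to e_j ⊗ (i-th column of C)
        cols.append(np.kron(e_j, C) @ e_i)

M = np.column_stack(cols)            # md x mn, columns ordered C_11, C_21, ...

print(np.allclose(M, np.kron(np.eye(m), C)))  # prints True
```

The column at position $jn+i$ (zero-based) of both $M$ and $\text{Id}_m \otimes C$ is $e_j \otimes c_i$, where $c_i$ is the $i$-th column of $C$, which is exactly the identity being claimed.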

Does anyone know a reference that proves these identities or how I could explain this procedure of running over these indices in a friendly way? Thanks.