What's $\frac{\partial}{\partial A}A$?
In this Python app it says that it's $I\otimes I$, but how can that be?
$\frac{\partial}{\partial A}A=\left[\frac{\partial}{\partial A_{ij}}A\right]=[\frac{\partial}{\partial A_{ij}}A_{lk}]$
which equals $1$ only when $l=i$ and $k=j$; i.e.
$\frac{\partial}{\partial A_{ij}}A$ is a matrix with the same dimensions as $A$ in which exactly one entry is $1$ and all the others are zero.
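This single-entry structure is easy to confirm numerically. Here is a minimal sketch (the $2\times 2$ size and the index pair $(i,j)=(0,1)$ are arbitrary choices for illustration):

```python
import numpy as np

# Hypothetical 2x2 example: dA/dA_{ij} for (i, j) = (0, 1).
m, n = 2, 2
i, j = 0, 1

# The claimed derivative: a single-entry matrix E with E[i, j] = 1.
E = np.zeros((m, n))
E[i, j] = 1.0

# Finite-difference check: perturbing only A[i, j] by h changes A by h*E,
# so the difference quotient recovers E.
A = np.arange(4.0).reshape(2, 2)
h = 1e-6
Ap = A.copy()
Ap[i, j] += h
assert np.allclose((Ap - A) / h, E)
```

So differentiating the matrix $A$ with respect to one of its scalar entries does give a matrix, and collecting these over all $(i,j)$ gives a fourth-order object.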
The Python app cannot handle higher-order tensors, so it uses vectorization to flatten everything into vectors and proceeds as follows.
$$\eqalign{
A &= IAI \cr
{\rm vec}(A) &= (I\otimes I)\,{\rm vec}(A) \cr
a &= (I\otimes I)\,a \cr
da &= (I\otimes I)\,da \cr
\frac{\partial a}{\partial a} &= I\otimes I \cr
}$$
The tensor gradient is actually very easy to calculate with index notation:
$$\eqalign{
\frac{\partial A_{ij}}{\partial A_{kl}} &= \delta_{ik}\delta_{jl} \cr
}$$
which simply says that if the index pair $(i,j)$ is equal to the pair $(k,l)$ then the derivative is one, otherwise it's zero.
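The two viewpoints can be checked against each other in NumPy. The sketch below (sizes are arbitrary assumptions; `vec` uses column-major stacking, the usual convention for the identity ${\rm vec}(XYZ)=(Z^T\otimes X)\,{\rm vec}(Y)$) builds the numerical Jacobian of ${\rm vec}(A)$ with respect to ${\rm vec}(A)$ and compares it to $I\otimes I$, then flattens the fourth-order tensor $\delta_{ik}\delta_{jl}$ to recover the same matrix:

```python
import numpy as np

m, n = 3, 4
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))

def vec(M):
    # Column-major (Fortran-order) stacking of the columns of M.
    return M.reshape(-1, order="F")

# The claimed Jacobian of the identity map a -> a: I (n x n) kron I (m x m),
# which is just the mn x mn identity matrix.
J = np.kron(np.eye(n), np.eye(m))
assert np.allclose(J, np.eye(m * n))

# Numerical Jacobian: perturb one entry of vec(A) at a time.
h = 1e-6
num_J = np.zeros((m * n, m * n))
for k in range(m * n):
    da = np.zeros(m * n)
    da[k] = h
    Ap = A + da.reshape(m, n, order="F")
    num_J[:, k] = (vec(Ap) - vec(A)) / h
assert np.allclose(num_J, J)

# The fourth-order tensor T[i,j,k,l] = delta_ik * delta_jl.
T = np.einsum("ik,jl->ijkl", np.eye(m), np.eye(n))
# Flattening the (i,j) and (k,l) pairs column-major (index i + j*m)
# turns T into exactly the matrix I kron I.
M = T.transpose(1, 0, 3, 2).reshape(m * n, m * n)
assert np.allclose(M, J)
```

So the app's $I\otimes I$ and the index-notation $\delta_{ik}\delta_{jl}$ are the same object, just with the index pairs flattened into single indices.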