I want to do this matrix calculus: given a distance matrix $D(X)_{n \times n}$ of squared Euclidean distances between $n$ points $X \in \mathbb{R}^k$, and given the centering matrix $C_n$ as defined in https://en.wikipedia.org/wiki/Centering_matrix, I want to find the derivative of $$-C_n D(X) C_n$$ with respect to $X$. Eqn 1049 or 1050 of chapter 5, 'Euclidean Distance Matrix', of this wonderful book: https://ccrma.stanford.edu/~dattorro/EDM.pdf, which gives a simplification of $D(X)$, could be useful.
My guess is that the gradient would be $-C_n(\cdot)$, where the placeholder $(\cdot)$ is the r.h.s. of eqn 1049, but I could be wrong. In any case, I am looking for a simpler representation of this derivative so that it's wieldy to use in computations and as part of a larger result or analysis.
The matrix $-0.5\,C_n D(X) C_n$ occurs in the technique of classical multidimensional scaling and is related to the Gramian matrix representation, as can be seen here as well: Calculating Gramian matrix from Euclidean distance matrix.
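For context, the double-centering step of classical MDS can be sketched numerically. This is a minimal NumPy illustration (the sizes `n`, `k` and all variable names are my own); it checks that $-\tfrac12 C_n D C_n$ recovers the Gram matrix of the centered points:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3
X = rng.standard_normal((k, n))         # columns are n points in R^k
Xc = X - X.mean(axis=1, keepdims=True)  # center the point cloud

# Gram matrix and squared-distance matrix of the centered points
G = Xc.T @ Xc
g = np.diag(G)
D = g[:, None] + g[None, :] - 2 * G

# Double centering recovers the Gram matrix of the centered points
C = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * C @ D @ C
assert np.allclose(B, G)
```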
The Gram and Distance matrices of the data matrix $X$ are $$\eqalign{ G &= X^TX,\qquad g={\rm diag}(G) \\ D &= g{\tt1}^T + {\tt1}g^T - 2G \\ }$$ The centering matrix is $$\eqalign{ C &= I - \frac{{\tt11}^T}{n},\qquad C{\tt1}=0,\quad{\tt1}^TC=0^T \\ }$$ Write the function of interest and calculate its differential. $$\eqalign{ F &= -CDC \;=\; 2CGC \\ dF &= 2C\,dG\,C \\ &= 2CX^TdX\,C + 2C\,dX^TXC \\ }$$ Then use vectorization to flatten the matrices and obtain the gradient as a matrix. $$\eqalign{ df &= \Big(2\left(C\otimes CX^T\right) + 2\left(CX^T\otimes C\right)K\Big)\,dx \\ \frac{\partial f}{\partial x} &= 2\left(C\otimes CX^T\right) + 2\left(CX^T\otimes C\right)K \\ }$$ where $K$ is the commutation matrix associated with the Kronecker product.
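The vectorized gradient can be checked numerically. Below is a sketch under my own conventions (column-major `vec`, a hand-rolled `commutation_matrix` helper, and arbitrary sizes `n`, `k`); since $F$ is quadratic in $X$, a central difference recovers the differential exactly up to rounding:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 3
X = rng.standard_normal((k, n))        # data matrix, columns are points
dX = rng.standard_normal((k, n))       # arbitrary perturbation direction
C = np.eye(n) - np.ones((n, n)) / n    # centering matrix

def F(X):
    G = X.T @ X
    g = np.diag(G)
    D = g[:, None] + g[None, :] - 2 * G
    return -C @ D @ C

def vec(A):
    # column-major (Fortran-order) vectorization, matching vec() in the identities
    return A.flatten(order='F')

def commutation_matrix(m, n):
    # K @ vec(A) = vec(A.T) for any m x n matrix A
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[j + i * n, i + j * m] = 1.0
    return K

K = commutation_matrix(k, n)
J = 2 * np.kron(C, C @ X.T) + 2 * np.kron(C @ X.T, C) @ K

# F is quadratic in X, so a central difference gives dF exactly (up to rounding)
h = 0.5
dF = (F(X + h * dX) - F(X - h * dX)) / (2 * h)
assert np.allclose(J @ vec(dX), vec(dF))
```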
Without vectorization, the gradient is a matrix-by-matrix derivative (a fourth order tensor).
I can provide an expression for that, but I doubt you'd find it useful.
Let $Y=XC,\,$ then the tensor gradient expression is $$\frac{\partial F_{ij}}{\partial X_{kl}}=2C_{il}Y_{kj}+2Y_{ki}C_{lj}$$
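The component formula above is easy to verify directly with `einsum`. A minimal sketch (sizes and names are my own); contracting the fourth-order tensor against a direction $dX$ must reproduce the directional derivative, which is exact via central differences because $F$ is quadratic:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 3
X = rng.standard_normal((k, n))
dX = rng.standard_normal((k, n))
C = np.eye(n) - np.ones((n, n)) / n
Y = X @ C

def F(X):
    G = X.T @ X
    g = np.diag(G)
    D = g[:, None] + g[None, :] - 2 * G
    return -C @ D @ C

# Fourth-order tensor  T_ijkl = dF_ij / dX_kl = 2 C_il Y_kj + 2 Y_ki C_lj
T = 2 * np.einsum('il,kj->ijkl', C, Y) + 2 * np.einsum('ki,lj->ijkl', Y, C)

# Contract against dX; central difference is exact since F is quadratic in X
dF = np.einsum('ijkl,kl->ij', T, dX)
h = 0.5
dF_num = (F(X + h * dX) - F(X - h * dX)) / (2 * h)
assert np.allclose(dF, dF_num)
```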
Update
Perhaps this is a better way of writing the tensor. $$\eqalign{ H &= C\star Y \qquad&\big({\rm Dyadic\,Product}\big) \\ H_{ijkl} &= C_{ij}Y_{kl}& \\ \frac{\partial F_{ij}}{\partial X_{kl}} &= 2\big(H_{ilkj} + H_{jlki}\big)\quad& \\ }$$
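The dyadic form is just an index permutation of the component formula, which can be confirmed numerically. A small sketch (sizes and names my own; note $H_{jlki} = C_{jl}Y_{ki} = Y_{ki}C_{lj}$ because $C$ is symmetric):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 5, 3
X = rng.standard_normal((k, n))
C = np.eye(n) - np.ones((n, n)) / n
Y = X @ C

# Dyadic product H_ijkl = C_ij Y_kl, then permute indices per the formula
H = np.einsum('ij,kl->ijkl', C, Y)
T = 2 * (np.einsum('ilkj->ijkl', H) + np.einsum('jlki->ijkl', H))

# Must match the component form  dF_ij/dX_kl = 2 C_il Y_kj + 2 Y_ki C_lj
T_ref = 2 * np.einsum('il,kj->ijkl', C, Y) + 2 * np.einsum('ki,lj->ijkl', Y, C)
assert np.allclose(T, T_ref)
```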