Given an input matrix,
$$ X = \begin{bmatrix} x_{00} & x_{01} & x_{02} & x_{03} \\ x_{10} & x_{11} & x_{12} & x_{13} \\ x_{20} & x_{21} & x_{22} & x_{23} \\ x_{30} & x_{31} & x_{32} & x_{33} \end{bmatrix} $$
a kernel
$$ W = \begin{bmatrix} w_{00} & w_{01} \\ w_{10} & w_{11} \end{bmatrix} $$
and the result of cross-correlation
$$ \begin{align} Z &= X \star W \\ Z &= \begin{bmatrix} z_{00} & z_{01} & z_{02} \\ z_{10} & z_{11} & z_{12} \\ z_{20} & z_{21} & z_{22} \end{bmatrix} \end{align} $$
$$ \begin{align} z_{00} &= x_{00} w_{00} + x_{01} w_{01} + x_{10} w_{10} + x_{11} w_{11} \\ z_{01} &= x_{01} w_{00} + x_{02} w_{01} + x_{11} w_{10} + x_{12} w_{11} \\ z_{02} &= x_{02} w_{00} + x_{03} w_{01} + x_{12} w_{10} + x_{13} w_{11} \\ z_{10} &= x_{10} w_{00} + x_{11} w_{01} + x_{20} w_{10} + x_{21} w_{11} \\ z_{11} &= x_{11} w_{00} + x_{12} w_{01} + x_{21} w_{10} + x_{22} w_{11} \\ z_{12} &= x_{12} w_{00} + x_{13} w_{01} + x_{22} w_{10} + x_{23} w_{11} \\ z_{20} &= x_{20} w_{00} + x_{21} w_{01} + x_{30} w_{10} + x_{31} w_{11} \\ z_{21} &= x_{21} w_{00} + x_{22} w_{01} + x_{31} w_{10} + x_{32} w_{11} \\ z_{22} &= x_{22} w_{00} + x_{23} w_{01} + x_{32} w_{10} + x_{33} w_{11} \end{align} $$
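As a sanity check, the valid cross-correlation above can be sketched in a few lines of NumPy (the helper name `cross_correlate` and the example values are my own):

```python
import numpy as np

def cross_correlate(X, W):
    """Valid cross-correlation: slide W over X without flipping it."""
    kh, kw = W.shape
    oh, ow = X.shape[0] - kh + 1, X.shape[1] - kw + 1
    Z = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            Z[i, j] = np.sum(X[i:i + kh, j:j + kw] * W)
    return Z

X = np.arange(16.0).reshape(4, 4)        # x_ij = 4*i + j
W = np.array([[1.0, 2.0], [3.0, 4.0]])
Z = cross_correlate(X, W)
# z_00 = x_00*w_00 + x_01*w_01 + x_10*w_10 + x_11*w_11
#      = 0*1 + 1*2 + 4*3 + 5*4 = 34
```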
I need to get the derivative of a loss function $l : \mathbb{R}^{3 \times 3} \to \mathbb{R}$ (a function of $Z$) with respect to the kernel $W$.
I already have $\frac{\partial l}{\partial Z}$ and $\frac{\partial Z}{\partial W_{ij}}$,
$$ \frac{\partial l}{\partial Z} = \begin{bmatrix} \frac{\partial l}{\partial z_{00}} & \frac{\partial l}{\partial z_{01}} & \frac{\partial l}{\partial z_{02}} \\ \frac{\partial l}{\partial z_{10}} & \frac{\partial l}{\partial z_{11}} & \frac{\partial l}{\partial z_{12}} \\ \frac{\partial l}{\partial z_{20}} & \frac{\partial l}{\partial z_{21}} & \frac{\partial l}{\partial z_{22}} \end{bmatrix} $$
$$ \newcommand{\grad}[2]{\frac{\partial #1}{\partial #2}} \begin{array}{cc} \grad{Z}{w_{00}} = \begin{bmatrix} \grad{z_{00}}{w_{00}} & \grad{z_{01}}{w_{00}} & \grad{z_{02}}{w_{00}} \\ \grad{z_{10}}{w_{00}} & \grad{z_{11}}{w_{00}} & \grad{z_{12}}{w_{00}} \\ \grad{z_{20}}{w_{00}} & \grad{z_{21}}{w_{00}} & \grad{z_{22}}{w_{00}} \end{bmatrix} = \begin{bmatrix} x_{00} & x_{01} & x_{02} \\ x_{10} & x_{11} & x_{12} \\ x_{20} & x_{21} & x_{22} \end{bmatrix} & \grad{Z}{w_{01}} = \begin{bmatrix} \grad{z_{00}}{w_{01}} & \grad{z_{01}}{w_{01}} & \grad{z_{02}}{w_{01}} \\ \grad{z_{10}}{w_{01}} & \grad{z_{11}}{w_{01}} & \grad{z_{12}}{w_{01}} \\ \grad{z_{20}}{w_{01}} & \grad{z_{21}}{w_{01}} & \grad{z_{22}}{w_{01}} \end{bmatrix} = \begin{bmatrix} x_{01} & x_{02} & x_{03} \\ x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \end{bmatrix} \\ \grad{Z}{w_{10}} = \begin{bmatrix} \grad{z_{00}}{w_{10}} & \grad{z_{01}}{w_{10}} & \grad{z_{02}}{w_{10}} \\ \grad{z_{10}}{w_{10}} & \grad{z_{11}}{w_{10}} & \grad{z_{12}}{w_{10}} \\ \grad{z_{20}}{w_{10}} & \grad{z_{21}}{w_{10}} & \grad{z_{22}}{w_{10}} \end{bmatrix} = \begin{bmatrix} x_{10} & x_{11} & x_{12} \\ x_{20} & x_{21} & x_{22} \\ x_{30} & x_{31} & x_{32} \end{bmatrix} & \grad{Z}{w_{11}} = \begin{bmatrix} \grad{z_{00}}{w_{11}} & \grad{z_{01}}{w_{11}} & \grad{z_{02}}{w_{11}} \\ \grad{z_{10}}{w_{11}} & \grad{z_{11}}{w_{11}} & \grad{z_{12}}{w_{11}} \\ \grad{z_{20}}{w_{11}} & \grad{z_{21}}{w_{11}} & \grad{z_{22}}{w_{11}} \end{bmatrix} = \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} \end{array} $$
but I have no idea what to do next to get $\frac{\partial l}{\partial W} = \frac{\partial l}{\partial Z} \frac{\partial Z}{\partial W}$.
Thanks to J.G.'s comment, I was able to derive a solution. I'm not sure the notation is entirely standard, but the math checks out.
We can express cross-correlation in Einstein summation notation (repeated indices $u, v$ are implicitly summed over):
$$ \newcommand{\grad}[2]{\frac{\partial #1}{\partial #2}} Z_{[i,j]} = X_{[i+u,j+v]} W_{[u,v]} $$
and apply the chain rule in index notation (summing over the repeated indices $i, j$):
$$ \begin{align} \grad{l}{W_{[s,t]}} &= \grad{l}{Z_{[i,j]}} \grad{Z_{[i,j]}}{W_{[s,t]}} \\ &= \grad{l}{Z_{[i,j]}} X_{[i+u,j+v]} \grad{W_{[u,v]}}{W_{[s,t]}} \\ &= \grad{l}{Z_{[i,j]}} X_{[i+u,j+v]} \delta_{[u,s]} \delta_{[v,t]} \\ &= \grad{l}{Z_{[i,j]}} X_{[i+s,j+t]} \\ &= X_{[s+i,t+j]} \grad{l}{Z_{[i,j]}} \end{align} $$
Translating back to matrix notation,
$$ \grad{l}{W} = X \star \grad{l}{Z} $$
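This identity is easy to verify numerically. Assuming a toy loss $l = \frac{1}{2}\sum_{i,j} z_{ij}^2$ (my choice, so that $\partial l / \partial Z = Z$), the analytic gradient $X \star \partial l / \partial Z$ can be compared against central finite differences:

```python
import numpy as np

def cross_correlate(X, W):
    """Valid cross-correlation: slide W over X without flipping it."""
    kh, kw = W.shape
    oh, ow = X.shape[0] - kh + 1, X.shape[1] - kw + 1
    Z = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            Z[i, j] = np.sum(X[i:i + kh, j:j + kw] * W)
    return Z

def loss(X, W):
    return 0.5 * np.sum(cross_correlate(X, W) ** 2)

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
W = rng.standard_normal((2, 2))

# Analytic gradient: dl/dW = X star dl/dZ, with dl/dZ = Z for this loss.
dl_dZ = cross_correlate(X, W)
dl_dW = cross_correlate(X, dl_dZ)   # note: a 4x4 star 3x3 gives 2x2 = W's shape

# Numerical gradient by central differences.
eps = 1e-6
num = np.zeros_like(W)
for s in range(W.shape[0]):
    for t in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[s, t] += eps
        Wm[s, t] -= eps
        num[s, t] = (loss(X, Wp) - loss(X, Wm)) / (2 * eps)

assert np.allclose(dl_dW, num, atol=1e-4)
```

Note the shape check built into the formula: cross-correlating the $4 \times 4$ input with the $3 \times 3$ gradient yields a $2 \times 2$ result, matching $W$.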
Now, if anyone needs $\grad{l}{X}$, here's the derivation:
$$ \begin{align} \grad{l}{X_{[s,t]}} &= \grad{l}{Z_{[i,j]}} \grad{Z_{[i,j]}}{X_{[s,t]}} \\ &= \grad{l}{Z_{[i,j]}} \grad{ X_{[i+u,j+v]}}{X_{[s,t]}} W_{[u,v]} \\ &= \grad{l}{Z_{[i,j]}} \delta_{[i+u,s]} \delta_{[j+v,t]} W_{[u,v]} \end{align} $$
$\delta_{[i+u,s]}$ is $1$ only when $s = i + u$, i.e. $u = s - i$, and $\delta_{[j+v,t]}$ only when $t = j + v$, i.e. $v = t - j$, so
$$ \begin{align} &= \grad{l}{Z_{[i,j]}} \delta_{[u,s-i]} \delta_{[v,t-j]} W_{[u,v]} \\ &= \grad{l}{Z_{[i,j]}} W_{[s-i,t-j]} \\ \grad{l}{X} &= \grad{l}{Z} \underset{\text{full}}{*} W \end{align} $$
where $\underset{\text{full}}{*}$ denotes full convolution; entries of $\grad{l}{Z}$ with indices outside its valid range are treated as zero, which is exactly the zero-padding that full convolution implies.
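This one can be checked numerically as well. A full convolution can be sketched as zero-padding by the kernel size minus one and cross-correlating with the $180°$-rotated kernel (the helpers and the toy loss $l = \frac{1}{2}\sum z_{ij}^2$ are my own choices):

```python
import numpy as np

def cross_correlate(X, W):
    """Valid cross-correlation: slide W over X without flipping it."""
    kh, kw = W.shape
    oh, ow = X.shape[0] - kh + 1, X.shape[1] - kw + 1
    Z = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            Z[i, j] = np.sum(X[i:i + kh, j:j + kw] * W)
    return Z

def full_convolve(A, W):
    """Full convolution: zero-pad A by (kernel size - 1) on each side,
    then cross-correlate with the 180-degree-rotated kernel."""
    kh, kw = W.shape
    Ap = np.pad(A, ((kh - 1, kh - 1), (kw - 1, kw - 1)))
    return cross_correlate(Ap, W[::-1, ::-1])

def loss(X, W):
    return 0.5 * np.sum(cross_correlate(X, W) ** 2)

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 4))
W = rng.standard_normal((2, 2))

# Analytic gradient: dl/dX = dl/dZ full-convolved with W, dl/dZ = Z here.
dl_dZ = cross_correlate(X, W)
dl_dX = full_convolve(dl_dZ, W)     # 3x3 padded to 5x5, then star 2x2 -> 4x4

# Numerical gradient by central differences.
eps = 1e-6
num = np.zeros_like(X)
for s in range(X.shape[0]):
    for t in range(X.shape[1]):
        Xp, Xm = X.copy(), X.copy()
        Xp[s, t] += eps
        Xm[s, t] -= eps
        num[s, t] = (loss(Xp, W) - loss(Xm, W)) / (2 * eps)

assert np.allclose(dl_dX, num, atol=1e-4)
```

The shapes work out as the derivation requires: padding the $3 \times 3$ gradient to $5 \times 5$ and cross-correlating with the $2 \times 2$ kernel recovers a $4 \times 4$ result matching $X$.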