I am working on an optimization problem where I have to find the derivative of $$\langle f(\alpha) , Wf(\alpha) \rangle$$ with respect to $\alpha$. Here $\langle \cdot , \cdot \rangle$ denotes the Frobenius inner product, $W$ is a constant matrix, and $f$ is a matrix-valued function.
If $W$ is an identity matrix, I can find the derivative, but I am not able to find it when $W$ is not an identity matrix.
For consistency, let's use uppercase Latin for matrices, and lowercase Greek for scalars. Let's also use colons (:) to represent the Frobenius product.
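The derivation below relies on a few standard properties of the Frobenius product, namely its trace definition, its invariance under transposition of both arguments, and the rule for moving a factor across the colon: $$\eqalign{ A:B &= {\rm tr}(A^TB) = B:A = A^T:B^T \cr A:BC &= B^TA:C = AC^T:B \cr }$$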
Given the gradient of $F$ with respect to $\alpha$, i.e. $$G = \frac{\partial F}{\partial\alpha} \implies dF=G\,d\alpha$$ we can calculate the differential and gradient for the function of interest as $$\eqalign{ \phi &= F:WF = W:FF^T \cr d\phi &= W:(dF\,F^T+F\,dF^T) = (W+W^T):dF\,F^T = (W+W^T)F:G\,d\alpha \cr \frac{\partial\phi}{\partial\alpha} &= (W+W^T)F:G \cr }$$
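As a sanity check, the formula $\frac{\partial\phi}{\partial\alpha} = (W+W^T)F:G$ can be verified numerically against a central finite difference. The sketch below uses a hypothetical test function $F(\alpha) = A + \alpha B$ (so that $G = B$) with randomly generated matrices; any differentiable $F$ would do.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3
A = rng.standard_normal((n, m))
B = rng.standard_normal((n, m))
W = rng.standard_normal((n, n))  # constant, deliberately non-symmetric

# Hypothetical test function: F(alpha) = A + alpha*B, hence G = dF/dalpha = B
F = lambda a: A + a * B
G = B

# phi(alpha) = F : W F  (Frobenius product, i.e. elementwise sum of products)
phi = lambda a: np.sum(F(a) * (W @ F(a)))

alpha, h = 0.7, 1e-6
numeric = (phi(alpha + h) - phi(alpha - h)) / (2 * h)   # central difference
analytic = np.sum(((W + W.T) @ F(alpha)) * G)           # (W+W^T)F : G

print(abs(numeric - analytic))  # agreement up to finite-difference error
```

The discrepancy should be on the order of the finite-difference truncation error, confirming the gradient formula for a general (non-symmetric) $W$.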