I have a matrix $A$ of size $m \times n$, a vector $B$ of size $n \times 1$, and a vector $c$ of size $m \times 1$. I'd like to take the derivative of the following function with respect to $A$:
$f(A) = \lVert A \times B - c\rVert_2^2$
Notice that this is an $\ell_2$ vector norm, not a matrix norm, since $A \times B$ is $m \times 1$. I am using this in an optimization problem where I need to find the optimal $A$.
Let $f:A\in M_{m,n}\rightarrow f(A)=(AB-c)^T(AB-c)\in \mathbb{R}$ ; then its derivative is
$Df_A:H\in M_{m,n}(\mathbb{R})\rightarrow 2(AB-c)^THB$.
If you want its gradient:
$Df_A(H)=tr(2B(AB-c)^TH)$ and $\nabla(f)_A=2(AB-c)B^T$.
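As a quick sanity check of the gradient formula $\nabla(f)_A=2(AB-c)B^T$, here is a small NumPy sketch (random sizes and data are my own choice) comparing it against a central finite-difference approximation:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, 1))
c = rng.standard_normal((m, 1))

def f(A):
    # f(A) = (AB - c)^T (AB - c)
    r = A @ B - c
    return float(r.T @ r)

# Analytic gradient: 2 (AB - c) B^T, an m x n matrix like A
grad = 2 * (A @ B - c) @ B.T

# Central finite differences, one entry of A at a time
eps = 1e-6
num = np.zeros_like(A)
for i in range(m):
    for j in range(n):
        E = np.zeros_like(A)
        E[i, j] = eps
        num[i, j] = (f(A + E) - f(A - E)) / (2 * eps)

assert np.allclose(grad, num, atol=1e-5)
```

The agreement of the two matrices is exactly the statement $Df_A(H)=\langle 2(AB-c)B^T,H\rangle$ applied to the coordinate directions $H=E_{ij}$.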
EDIT 1. Some details for @Gigili. Let $Z$ be open in $\mathbb{R}^n$ and $g:U\in Z\rightarrow g(U)\in\mathbb{R}^m$ be differentiable; its derivative at $U$ is the linear map $Dg_U$ satisfying $g(U+H)=g(U)+Dg_U(H)+o(\lVert H\rVert)$.
Here $Df_A(H)=(HB)^T(AB-c)+(AB-c)^THB=2(AB-c)^THB$, since both terms are scalars (we are in $\mathbb{R}$) and each is the transpose of the other.
Thus $Df_A(H)=tr(2B(AB-c)^TH)=tr((2(AB-c)B^T)^TH)=\langle 2(AB-c)B^T,H\rangle$ and $\nabla(f)_A=2(AB-c)B^T$.
EDIT 2. @user79950, it seems to me that you want to calculate $\inf_A f(A)$; if so, then computing the derivative is unnecessary. Indeed, if $B=0$, then $f(A)=\lVert c\rVert_2^2$ is constant; if $B\neq 0$, then there always exists $A_0$ such that $A_0B=c$ (for instance $A_0=\frac{cB^T}{B^TB}$), and the infimum is $0$.