We know that, for scalars, $(fg)'=f'g+fg'$. Does the same hold for vectors with respect to matrices, i.e., if $f(A)$ is a vector that depends on $A$, does
$$ \frac{\partial Af(A)}{\partial A} = \frac{\partial A}{\partial A}f(A)+A^T\frac{\partial f(A)}{\partial A} $$
hold? If not, how do I compute $\frac{\partial Af(A)}{\partial A}$?
The rule you are seeking does exist for derivatives (with respect to a scalar) and for differentials. $$\eqalign{ \frac{d\,(f\odot g)}{dx} &= \frac{df}{dx}\odot g &+ f\odot\frac{dg}{dx} \\ d\,(f\odot g) &= df\odot g &+ f\odot dg \\ }$$ where $(\odot)$ represents any product (Hadamard, Kronecker, dot, double-dot, dyadic, tensor, etc.), $(f,g)$ can be any scalars, vectors, matrices, or tensors compatible with that product, and $(x)$ is a scalar variable. You can think of the differential as the derivative with respect to a scalar where the scalar variable has not yet been specified. One of its virtues is that it saves a lot of formatting.
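As a sanity check, here is a small numerical verification of the scalar-variable product rule for the Hadamard product, using a central finite difference. The particular functions $f$ and $g$ below are illustrative choices, not anything from the problem:

```python
import numpy as np

def f(x):   # example vector-valued function of a scalar x
    return np.array([np.sin(x), x**2, np.exp(-x)])

def g(x):   # another example vector-valued function
    return np.array([np.cos(x), 3.0 * x, 1.0 + x])

def df(x):  # derivative of f, computed by hand
    return np.array([np.cos(x), 2.0 * x, -np.exp(-x)])

def dg(x):  # derivative of g, computed by hand
    return np.array([-np.sin(x), 3.0, 1.0])

x0, h = 0.7, 1e-6
# central-difference approximation of d(f∘g)/dx
numeric = (f(x0 + h) * g(x0 + h) - f(x0 - h) * g(x0 - h)) / (2 * h)
# product rule: (df/dx)∘g + f∘(dg/dx)
analytic = df(x0) * g(x0) + f(x0) * dg(x0)
assert np.allclose(numeric, analytic, atol=1e-6)
```

The same check works with any of the other products; only the `*` changes.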
For your specific problem, since you told us nothing about the function $f$, I'll assume that you already know how to calculate its gradient, i.e. $${\cal J}=\frac{\partial f}{\partial A} \quad\implies df = {\cal J}:dA$$ Define a new vector function $\,y=Af\;$ and calculate its differential and gradient. $$\eqalign{ dy &= dA\cdot f + A\cdot df \\ &= dA\cdot f + A\cdot{\cal J}:dA \\ &= ({\cal H}\cdot f + A\cdot{\cal J}):dA \\ \frac{\partial y}{\partial A} &= {\cal H}\cdot f + A\cdot{\cal J} \\ }$$ In the above, the symbol $(\cdot)$ represents the dot-product and $(:)$ the double-dot product.
The 4th order tensor ${\cal H}$ is the dyadic product of identity matrices: $\,{\cal H}=I\star I$
which can be written in component form using Kronecker delta symbols: $\,{\cal H}_{ijkl} = \delta_{ij}\,\delta_{kl}$
Note that these definitions mean that $\;{\cal H}\cdot f = I\star f$
The 3rd order tensor ${\cal J}$ is left to you to calculate component-wise: $\;{\cal J}_{ijk} = \frac{\partial f_i}{\partial A_{jk}}$
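If a closed form for ${\cal J}$ is inconvenient, it can always be approximated component-wise by central differences, directly from the definition ${\cal J}_{ijk} = \frac{\partial f_i}{\partial A_{jk}}$. A minimal sketch (the helper name `jacobian_fd` and the test function are my own, for illustration):

```python
import numpy as np

def jacobian_fd(f, A, h=1e-6):
    """Approximate J_{ijk} = ∂f_i/∂A_{jk} by central differences."""
    J = np.zeros(f(A).shape + A.shape)
    for j in range(A.shape[0]):
        for k in range(A.shape[1]):
            E = np.zeros_like(A); E[j, k] = h
            J[:, j, k] = (f(A + E) - f(A - E)) / (2 * h)
    return J

# check against a case with a known answer: f(A) = A @ v has J_{ijk} = δ_ij v_k
rng = np.random.default_rng(1)
n = 3
v = rng.standard_normal(n)
A = rng.standard_normal((n, n))
J = jacobian_fd(lambda M: M @ v, A)
assert np.allclose(J, np.einsum('ij,k->ijk', np.eye(n), v), atol=1e-5)
```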