How to compute the gradient of
$f(x) := a \oslash (a \odot x + b) $,
with respect to $x \in \mathbb{R}^n$, where $\oslash$ is element-wise division, $\odot$ is element-wise multiplication, and $a , b \in \mathbb{R}^n$.
The gradient (Jacobian) should be an $n \times n$ matrix, but I am not sure how to approach the problem.
Let $$\eqalign{ v &= a \odot x + b\\ dv &= a \odot dx }$$
Then, since $a$ is a constant vector ($da = 0$), we can find the differential and gradient using the quotient rule: $$\eqalign{ f &= a \oslash v\\ df &= (da \odot v - a \odot dv)\oslash(v\odot v) \\ &= -a \odot (a \odot dx) \oslash (v \odot v) \\ &= (-a\odot a \oslash (v\odot v)) \odot dx \\ &= (-f\odot f) \odot dx\\ &= -\operatorname{Diag}(f \odot f)\, dx }$$
Thus:
$$ \frac{\partial f}{\partial x} = -\operatorname{Diag}(f \odot f) $$
Here $\operatorname{Diag}(\cdot)$ is the operation that transforms a vector into a diagonal matrix. This is the standard procedure for converting a Hadamard product into a matrix product. For example:
$$ a \odot b =\operatorname{Diag}(a)\,b = Ab, $$ where $A := \operatorname{Diag}(a)$.
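As a sanity check, the Jacobian $-\operatorname{Diag}(f \odot f)$ can be compared against a central finite-difference approximation. A minimal NumPy sketch (the vectors $a$, $b$, $x$ here are arbitrary random choices, not from the original question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
a = rng.uniform(0.5, 2.0, n)
b = rng.uniform(0.5, 2.0, n)
x = rng.uniform(0.5, 2.0, n)

def f(x):
    # f(x) = a ⊘ (a ⊙ x + b), element-wise
    return a / (a * x + b)

# Analytic Jacobian from the derivation: -Diag(f ⊙ f)
fx = f(x)
J_analytic = -np.diag(fx * fx)

# Central finite differences, one column of the Jacobian per coordinate
eps = 1e-6
J_numeric = np.zeros((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = eps
    J_numeric[:, j] = (f(x + e) - f(x - e)) / (2 * eps)

print(np.max(np.abs(J_analytic - J_numeric)))
```

The maximum absolute discrepancy should be on the order of the finite-difference error (roughly $10^{-9}$ here), confirming that the off-diagonal entries vanish and the diagonal equals $-f_i^2$.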