Gradient of $f(x) = a \oslash (b + a \odot x)$ w.r.t. $x \in \mathbb{R}^n$


How to compute the gradient of

$f(x) := a \oslash (a \odot x + b) $,

with respect to $x \in \mathbb{R}^n$, where $\oslash$ is element-wise division, $\odot$ is element-wise multiplication, and $a , b \in \mathbb{R}^n$.


The gradient should be an $n \times n$ matrix (the Jacobian), but I am not sure how to approach the computation.

Best answer:

Let $$\eqalign{ v &= a \odot x + b\\ dv &= a \odot dx }$$

Since $a$ is constant, $da = 0$, so applying the quotient rule to the differential gives: $$\eqalign{ f &= a \oslash v\\ df &= (da \odot v - a \odot dv)\oslash(v\odot v) \\ &= -a \odot (a \odot dx) \oslash (v \odot v) \\ &= (-a\odot a \oslash (v\odot v)) \odot dx \\ &= (-f\odot f) \odot dx\\ &= -\operatorname{Diag}(f \odot f)\, dx }$$

Thus:

$$ \frac{\partial f}{\partial x} = -\operatorname{Diag}(f \odot f) $$

Here $\operatorname{Diag}(\cdot)$ is the operation that transforms a vector into a diagonal matrix. This is the standard procedure for converting a Hadamard product into a matrix product. For example:

$$ a \odot b = \operatorname{Diag}(a)\, b = Ab, \qquad \text{where } A := \operatorname{Diag}(a). $$
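As a sanity check, the closed-form Jacobian $-\operatorname{Diag}(f \odot f)$ can be compared against a finite-difference approximation. The sketch below uses NumPy; the vectors `a`, `b`, `x` are random positive values (an assumption, chosen to keep the denominator away from zero), and the step size `eps` is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Positive entries keep a*x + b bounded away from zero
a, b, x = rng.uniform(1.0, 2.0, size=(3, n))

def f(x):
    # f(x) = a ⊘ (a ⊙ x + b), element-wise
    return a / (a * x + b)

# Analytic Jacobian from the answer: -Diag(f ⊙ f)
J_analytic = -np.diag(f(x) ** 2)

# Central finite-difference Jacobian, built column by column
eps = 1e-6
J_fd = np.column_stack([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(n)
])

print(np.max(np.abs(J_analytic - J_fd)))
```

The off-diagonal entries of the finite-difference Jacobian come out (numerically) zero, which matches the element-wise structure of $f$: each output component depends only on the corresponding input component.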