The gradient and Hessian of the log of a sum of elementwise inverses of a vector


What are the gradient and Hessian of the log of the sum of the elementwise inverse of a vector?

Specifically, if $x$ is a vector, what are the gradient and Hessian with respect to $x$ of $$ \log\sum_i\frac{c_i}{x_i} $$ where $c$ is a vector of constants?

Answer

Let's use a colon (:) to denote the trace/Frobenius product, i.e. $A:B={\rm tr}(A^TB)$,
$\odot$ to denote elementwise/Hadamard multiplication,
$\oslash$ to denote elementwise/Hadamard division,
and $1$ to denote the all-ones vector.

Define a new scalar variable and its differential
$$\eqalign{
\sigma &= 1:c\oslash x \cr
&= (1\oslash x):c \cr\cr
d\sigma &= -dx\oslash (x\odot x):c \cr
&= -c\oslash (x\odot x):dx
}$$
Then the function, differential, and gradient are
$$\eqalign{
\lambda &= \log(\sigma) \implies \sigma = \exp(\lambda) \cr\cr
d\lambda &= \frac{d\sigma}{\sigma} = -\frac{1}{\sigma}\,c\oslash (x\odot x):dx \cr\cr
\frac{\partial \lambda}{\partial x} &= g = -\frac{1}{\sigma}\,c\oslash (x\odot x) = -e^{-\lambda}\,c\oslash (x\odot x)
}$$
If we let
$$G={\rm Diag}(g),\quad X={\rm Diag}(x)$$
then we can write the Hessian as
$$H=\frac{\partial g}{\partial x}=-\left(2GX^{-1}+gg^T\right)$$
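These formulas are easy to sanity-check numerically. Below is a minimal NumPy sketch (the function names `f`, `grad`, and `hess` are illustrative, not from the derivation) that compares the closed-form gradient $g = -\frac{1}{\sigma}\,c\oslash(x\odot x)$ and Hessian $H = -(2GX^{-1}+gg^T)$ against central finite differences:

```python
import numpy as np

def f(x, c):
    # lambda = log(sum_i c_i / x_i)
    return np.log(np.sum(c / x))

def grad(x, c):
    # g = -(1/sigma) * c / x^2, with sigma = sum_i c_i / x_i
    sigma = np.sum(c / x)
    return -c / (sigma * x**2)

def hess(x, c):
    # H = -(2*Diag(g)*Diag(x)^{-1} + g g^T)
    g = grad(x, c)
    return -(2 * np.diag(g / x) + np.outer(g, g))

rng = np.random.default_rng(0)
n = 5
x = rng.uniform(0.5, 2.0, size=n)  # keep x away from zero
c = rng.uniform(0.5, 2.0, size=n)

eps = 1e-6
# central finite differences of f (gradient) and of grad (Hessian)
g_fd = np.array([(f(x + eps * e, c) - f(x - eps * e, c)) / (2 * eps)
                 for e in np.eye(n)])
H_fd = np.array([(grad(x + eps * e, c) - grad(x - eps * e, c)) / (2 * eps)
                 for e in np.eye(n)])

print("gradient error:", np.max(np.abs(g_fd - grad(x, c))))
print("Hessian error: ", np.max(np.abs(H_fd - hess(x, c))))
```

Both errors should be tiny (on the order of the finite-difference truncation error), confirming the signs and the $2GX^{-1}$ diagonal term in the Hessian.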