Vector derivative of inner product involving inverse matrix


I have a vector $c\in\mathbb{R}^n$ and define the linear operator $L = A_0+\sum_{i=1}^m A_iCB_i$, where $C=\operatorname{diag}(c)$ and $A_i,B_i\in\mathbb{R}^{n\times n}$ are fixed matrices. I now consider the derivative with respect to $c$:

$$\frac{d}{dc}\langle Lx, y\rangle = \sum_iA^T_iy\odot B_ix,$$

where $\odot$ denotes pointwise (Hadamard) multiplication and $\langle\cdot,\cdot\rangle$ is the dot product. Can I derive a similar expression for $\frac{d}{dc}\langle L^{-1}x,y\rangle$? I was hoping to be able to use something similar to the identity

$$0=\frac{d}{d\alpha}(D^{-1}D) = \frac{dD^{-1}}{d\alpha}D+D^{-1}\frac{dD}{d\alpha} \implies \frac{dD^{-1}}{d\alpha} = -D^{-1}\frac{dD}{d\alpha}D^{-1},$$
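As a sanity check, the stated formula for $\frac{d}{dc}\langle Lx, y\rangle$ agrees with central finite differences; here is a minimal NumPy sketch (the dimensions, seed, and variable names are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
A = [rng.standard_normal((n, n)) for _ in range(m + 1)]  # A[0] plays the role of A_0
B = [rng.standard_normal((n, n)) for _ in range(m)]
c = rng.standard_normal(n)
x, y = rng.standard_normal(n), rng.standard_normal(n)

def L_of(c):
    """L = A_0 + sum_i A_i diag(c) B_i."""
    return A[0] + sum(A[i + 1] @ np.diag(c) @ B[i] for i in range(m))

def f(c):
    """f(c) = <L x, y>."""
    return (L_of(c) @ x) @ y

# claimed gradient: sum_i (A_i^T y) ⊙ (B_i x)
grad = sum((A[i + 1].T @ y) * (B[i] @ x) for i in range(m))

# central finite differences in each coordinate of c
eps = 1e-6
I = np.eye(n)
fd = np.array([(f(c + eps * I[k]) - f(c - eps * I[k])) / (2 * eps) for k in range(n)])
print(np.allclose(grad, fd, rtol=1e-4, atol=1e-8))
```

Since $f$ is linear in $c$, the finite differences here are exact up to roundoff, so the check is quite robust.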

however, I am not sure how to formulate the corresponding product rule here:

$$0 = \frac{d}{dc}\langle L^{-1}Lx, y\rangle = \,?$$
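For what it's worth, mechanically combining the derivative-of-inverse identity with the first gradient formula suggests the candidate $\frac{d}{dc}\langle L^{-1}x,y\rangle = -\sum_i A_i^T L^{-T}y \odot B_i L^{-1}x$. This is only a guess on my part, but it does agree with finite differences in a quick NumPy check (setup and names are mine; I keep $c$ small so that $L$ stays invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
A = [rng.standard_normal((n, n)) for _ in range(m + 1)]  # A[0] plays the role of A_0
B = [rng.standard_normal((n, n)) for _ in range(m)]
c = 0.1 * rng.standard_normal(n)  # small c keeps L comfortably invertible here
x, y = rng.standard_normal(n), rng.standard_normal(n)

def L_of(c):
    """L = A_0 + sum_i A_i diag(c) B_i."""
    return A[0] + sum(A[i + 1] @ np.diag(c) @ B[i] for i in range(m))

def g(c):
    """g(c) = <L^{-1} x, y>."""
    return np.linalg.solve(L_of(c), x) @ y

Linv = np.linalg.inv(L_of(c))
# candidate gradient: -sum_i (A_i^T L^{-T} y) ⊙ (B_i L^{-1} x)
cand = -sum((A[i + 1].T @ (Linv.T @ y)) * (B[i] @ (Linv @ x)) for i in range(m))

# central finite differences in each coordinate of c
eps = 1e-6
I = np.eye(n)
fd = np.array([(g(c + eps * I[k]) - g(c - eps * I[k])) / (2 * eps) for k in range(n)])
print(np.allclose(cand, fd, rtol=1e-4, atol=1e-7))
```

The candidate has the same shape as the first formula, with $x$ and $y$ replaced by $L^{-1}x$ and $L^{-T}y$ and an overall sign flip, which is what the $-D^{-1}\frac{dD}{d\alpha}D^{-1}$ identity would lead one to expect.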