Assume $S_1$ and $S_2$ are two $n \times n$ matrices (positive definite, if that helps), $c_1$ and $c_2$ are two scalars, and $u_1$ and $u_2$ are two $n \times 1$ vectors. In addition, $c_1 + c_2 = 1$; in the more general case of $m$ matrices $S_i$, vectors $u_i$, and scalars $c_i$, the $c_i$ likewise sum to 1.
What is the derivative of $(c_1 S_1+c_2 S_2)^{-1}(c_1 u_1+c_2 u_2)$ with respect to both $c_1$ and $c_2$?
The condition that $S_1$ and $S_2$ are positive definite is relevant to the existence of the inverse in the definition of the function. I assume that it is taken as given that the inverse exists at the relevant values of $c_1$ and $c_2$. This would be true in particular if $c_1$ and $c_2$ were positive.
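To spell out why positive coefficients suffice when the $S_i$ are positive definite: for any nonzero vector $x$,
$$x^\top (c_1 S_1 + c_2 S_2)\, x = c_1\, x^\top S_1 x + c_2\, x^\top S_2 x > 0,$$
so the combination $c_1 S_1 + c_2 S_2$ is itself positive definite and therefore invertible.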
By symmetry, the same method applies to both $c_1$ and $c_2$, so we are essentially differentiating the function $f(t)=(tA+B)^{-1}(tu+v)$, where $A$ and $B$ are matrices and $u$ and $v$ are column vectors. Write $f(t)=g(t)h(t)$, where $g(t)=(tA+B)^{-1}$ is matrix valued and $h(t)=tu+v$ is vector valued. By the product rule, $f'(t)=g'(t)h(t)+g(t)h'(t)$, so you just need $g'$ and $h'$. It is immediate that $h'(t)=u$. For $g'$, differentiate the identity $(tA+B)(tA+B)^{-1}=I$ to get $A(tA+B)^{-1}+(tA+B)g'(t)=0$, hence $g'(t)=-(tA+B)^{-1}A(tA+B)^{-1}$. Putting the pieces together,
$$f'(t) = -(tA+B)^{-1}A(tA+B)^{-1}(tu+v) + (tA+B)^{-1}u.$$
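As a sanity check on the product-rule formula, one can compare it against a central finite difference at a sample point. The matrices, vectors, and evaluation point below are arbitrary illustrative choices (random SPD matrices, $t = 0.3$), not anything fixed by the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def random_spd(n):
    # Random symmetric positive definite matrix (illustrative choice)
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = random_spd(n), random_spd(n)
u, v = rng.standard_normal(n), rng.standard_normal(n)

def f(t):
    # f(t) = (tA + B)^{-1} (tu + v), computed via a linear solve
    return np.linalg.solve(t * A + B, t * u + v)

def f_prime(t):
    # f'(t) = g'(t) h(t) + g(t) h'(t),
    # with g'(t) = -(tA+B)^{-1} A (tA+B)^{-1} and h'(t) = u
    Minv = np.linalg.inv(t * A + B)
    return -Minv @ A @ Minv @ (t * u + v) + Minv @ u

t, eps = 0.3, 1e-6
fd = (f(t + eps) - f(t - eps)) / (2 * eps)  # central finite difference
assert np.allclose(fd, f_prime(t), atol=1e-5)
```

With positive coefficients on SPD matrices, $tA + B$ is positive definite at $t = 0.3$, so both the solve and the inverse are well defined; the assertion passes because the analytic derivative matches the finite difference to within discretization error.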