I was looking to write the KKT conditions to solve this optimisation problem.
$$\min_{X}\; a^\top (I-X)^{-1} b \quad \text{subject to} \quad \sum_{j} x_{ij}\le k_i,\quad i=1,2,\ldots,N$$
There are $N^2$ decision variables in this problem and $N$ row-sum constraints, so I assume there will be a corresponding set of Lagrange multipliers. But I am clueless as to how to differentiate $a^\top (I-X)^{-1} b$ with respect to $x_{ij}$. How do we write the KKT conditions and solve problems of this sort?
The constraints aren't sufficient to imply that $I-X$ is nonsingular, so the objective isn't even continuous on the feasible set, let alone convex. (For instance, if every $k_i \ge 1$, then $X = I$ is feasible and $I - X = 0$ is singular.) Perhaps there is supposed to be an additional constraint, such as $X$ being negative semidefinite? But if you want to form the Lagrangian and differentiate it, it would be:
$$ L(X,\lambda) = a^T (I-X)^{-1}b + \lambda^T( X {1} - k) $$
$$ \frac{\partial L}{\partial X} = (I-X)^{-T} a b^T (I-X)^{-T} + \lambda 1^T $$
I got the formula for the matrix derivatives from the "Matrix Cookbook".
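As a sanity check on that gradient formula, here is a small numerical experiment (NumPy, with arbitrary random data chosen so that $I-X$ is well-conditioned) comparing the analytic derivative $(I-X)^{-\top} a b^\top (I-X)^{-\top}$ of the objective term against central finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
a = rng.standard_normal(N)
b = rng.standard_normal(N)
X = 0.1 * rng.standard_normal((N, N))  # small entries keep I - X nonsingular

def f(X):
    # objective term a^T (I - X)^{-1} b
    return a @ np.linalg.solve(np.eye(N) - X, b)

# analytic gradient: (I-X)^{-T} a b^T (I-X)^{-T}
M_invT = np.linalg.inv(np.eye(N) - X).T
G = M_invT @ np.outer(a, b) @ M_invT

# central finite-difference approximation, entry by entry
eps = 1e-6
G_fd = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        E = np.zeros((N, N))
        E[i, j] = eps
        G_fd[i, j] = (f(X + E) - f(X - E)) / (2 * eps)

assert np.allclose(G, G_fd, atol=1e-5)
```

The multiplier term $\lambda 1^\top$ is linear in $X$, so only the inverse term needs checking.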