Suppose the multivariate function $f(x_1, x_2, \ldots, x_n) = \sum\limits_{i=1}^n\ln \left( 1+e^{\,-c_i\sum\limits_{j=1}^n {d_{ij}} x_j} \right)$, where each $c_i$ is either $1$ or $-1$ and each $d_{ij}$ is an arbitrary real number.
I'd like to simplify the gradient of $f$ for the purpose of gradient-based optimization; specifically, I would like to further simplify $\dfrac{\partial f}{\partial x_k} = \sum\limits_{i=1}^n \dfrac{e^{\,-c_i\sum\limits_{j=1}^n d_{ij} x_j} (-c_id_{ik})} {1+e^{\,-c_i\sum\limits_{j=1}^n d_{ij} x_j }}$.
If it can be written in matrix form, so much the better, since many scientific software packages support matrix algebra. But is it possible to simplify the gradient further?
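For reference, here is a minimal NumPy sketch of the function and its componentwise gradient exactly as written above (the function names `f` and `grad_f` are mine, not from the question):

```python
import numpy as np

def f(x, c, d):
    """f(x) = sum_i log(1 + exp(-c_i * sum_j d_ij x_j))."""
    # log1p(exp(z)) is a numerically safer way to compute log(1 + exp(z))
    return np.sum(np.log1p(np.exp(-c * (d @ x))))

def grad_f(x, c, d):
    """Componentwise gradient, transcribed term by term from the formula above."""
    n = len(x)
    g = np.zeros(n)
    for k in range(n):
        for i in range(n):
            t = np.exp(-c[i] * (d[i] @ x))      # e^{-c_i sum_j d_ij x_j}
            g[k] += t * (-c[i] * d[i, k]) / (1.0 + t)
    return g
```

The double loop makes the formula explicit but is $O(n^3)$ overall; the matrix form below collapses it to a few vectorized operations.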
Use a series of variable substitutions to build up to the function $$\eqalign{ y &= Ax &\ \ \ dy = A\,dx \cr z &= -c\circ y &\ \ \ dz = -c\circ dy \cr e &= \exp(z) &\ \ \ de = e\circ dz \cr h &= 1+e &\ \ \ dh = de \cr g &= \log(h) &\ \ \ dg = \frac{dh}{h} \cr f &= 1:g &\ \ \ df = 1:dg \cr }$$ where the {Hadamard, Frobenius} products have been denoted by the symbols {$\,\circ, :\,$} respectively.
Also note the use of $A_{ij}=d_{ij}$ so as to avoid confusion with derivatives or differentials.
Continuing the expansion of that last differential $$\eqalign{ df &= 1:dg \cr &= 1:\frac{dh}{h} \cr &= 1:\frac{e\circ dz}{h} \cr &= 1:\frac{(1-h)\circ c\circ dy}{h} \cr &= (h^{-1}-1)\circ c:dy \cr &= (H^{-1}-I)\,c:A\,dx \cr &= A^T(H^{-1}-I)\,c:dx \cr\cr }$$ where $e=h-1$ and $dz=-c\circ dy$ were substituted in the fourth line, and $H={\rm Diag}(h)$.
So the gradient could be written as $$\eqalign{ \frac{\partial f}{\partial x} &= A^T(H^{-1}-I)\,c \cr }$$
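As a sanity check, this matrix formula can be verified numerically against a central finite difference of $f$. A minimal NumPy sketch (variable names follow the derivation; note that $(H^{-1}-I)\,c$ is just the elementwise product $(1/h-1)\circ c$, so the diagonal matrix never needs to be formed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))          # A_ij = d_ij
c = rng.choice([-1.0, 1.0], size=n)      # each c_i is +1 or -1
x = rng.standard_normal(n)

# Forward pass following the substitution chain: y, z, e, h
y = A @ x
h = 1.0 + np.exp(-c * y)                 # h = 1 + exp(z), with z = -c∘y

# Gradient A^T (H^{-1} - I) c, computed elementwise as A^T ((1/h - 1) * c)
grad = A.T @ ((1.0 / h - 1.0) * c)

# Central finite-difference check of df/dx
def f(x):
    return np.sum(np.log1p(np.exp(-c * (A @ x))))

eps = 1e-6
fd = np.array([(f(x + eps * np.eye(n)[k]) - f(x - eps * np.eye(n)[k])) / (2 * eps)
               for k in range(n)])
assert np.allclose(grad, fd, atol=1e-5)
```

This runs in $O(n^2)$ (one matrix-vector product each way), versus $O(n^3)$ for a naive loop over the componentwise formula.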