Solve the following equation for $x \in \mathbb{R}_+^n$
$$\frac{1}{x} = S x$$
where the inverse on $x$ is taken element-wise, and $S \in \mathbb{R}_+^{n\times n}$ is PSD (or PD, if needed).
This problem comes from trying to optimize
$$L(x) = 1_n^T \log(x) - \frac{1}{2} x^T S x$$
where the logarithm is taken element-wise over $x$, and $1_n$ is a column vector of $n$ ones; in other words, the first term is just `sum(log(x))`.
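To make the connection explicit, setting the gradient of $L$ to zero recovers the stated equation:
$$\nabla L(x) = \frac{1}{x} - S x = 0_n \quad\Longleftrightarrow\quad \frac{1}{x} = S x$$
with the inverse taken element-wise. Moreover, the Hessian $-\left(\mbox{diag}(x)\right)^{-2} - S$ is negative definite for $x > 0$ when $S$ is PSD, so $L$ is strictly concave on the positive orthant and any solution of the equation is the unique maximizer.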
If it helps (otherwise, ignore it), you can assume that $x$ must sum to a constant.
Because this equation is, at worst, quadratic, I was hoping there would be a closed-form solution, or at least a very efficient way of computing the solution short of gradient descent.
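For what it's worth, short of a closed form, Newton's method on the element-wise residual $F(x) = x \circ (Sx) - 1_n$ (the original equation multiplied through by $x$) converges very quickly from a positive starting point. A minimal numpy sketch, where the $2 \times 2$ matrix `S` is an arbitrary PD example rather than anything from the original problem:

```python
import numpy as np

# Newton's method on F(x) = x * (S x) - 1 (element-wise product),
# whose Jacobian is J = diag(S x) + diag(x) S.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # arbitrary PD example matrix
n = S.shape[0]
x = np.ones(n)               # positive starting point

for _ in range(50):
    Sx = S @ x
    F = x * Sx - 1.0
    if np.max(np.abs(F)) < 1e-12:
        break
    J = np.diag(Sx) + x[:, None] * S   # diag(Sx) + diag(x) S
    x = x - np.linalg.solve(J, F)

print(x)   # for this S, converges to 1/sqrt(3) in each coordinate
```

By symmetry of this particular `S`, the exact solution is $x_i = 1/\sqrt{3}$, which the iteration reaches in a handful of steps.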
We have the following system of equations in $\mathrm x > 0_n$
$$\left( \mbox{diag} (\mathrm x) \right)^{-1} 1_n = \mathrm S \,\mathrm x$$
Left-multiplying both sides by diagonal matrix $\mbox{diag} (\mathrm x)$,
$$1_n = \mbox{diag} (\mathrm x) \,\mathrm S \,\mathrm x$$
where the $i$-th (quadratic) equation is
$$1 = x_i \sum_{j=1}^n s_{ij} x_j = \mathrm x^\top \mathrm e_i \mathrm s_i^\top \mathrm x = \mbox{tr} \left( \,\mathrm e_i \mathrm s_i^\top \mathrm x \mathrm x^\top \right) = \langle \mathrm s_i \mathrm e_i^\top , \mathrm x \mathrm x^\top \rangle$$
where $\mathrm s_i$ is the $i$-th row (or column) of symmetric matrix $\rm S$.
Let $\mathrm X := \mathrm x \mathrm x^\top$. Hence, we have $n$ linear equality constraints in $\rm X$, a constraint on the rank of $\rm X$ and a positive semidefiniteness constraint
$$\begin{aligned} \langle \mathrm s_1 \mathrm e_1^\top, \mathrm X \rangle &= 1\\ \langle \mathrm s_2 \mathrm e_2^\top, \mathrm X\rangle &= 1\\ &\vdots\\ \langle \mathrm s_n \mathrm e_n^\top, \mathrm X\rangle &= 1\\ \mbox{rank} (\mathrm X) &= 1\\ \mathrm X &\succeq \mathrm O_n\end{aligned}$$
Since the nuclear norm is a convex proxy for the rank, we solve the following convex program in $\rm X$
$$\begin{array}{ll} \text{minimize} & \| \mathrm X \|_*\\ \text{subject to} & \langle \mathrm s_1 \mathrm e_1^\top, \mathrm X \rangle = 1\\ & \langle \mathrm s_2 \mathrm e_2^\top, \mathrm X \rangle = 1\\ & \qquad\vdots\\ & \langle \mathrm s_n \mathrm e_n^\top, \mathrm X \rangle = 1\\ & \mathrm X \succeq \mathrm O_n\end{array}$$
Since matrix $\rm X$ is symmetric and positive semidefinite, its eigenvalues are equal to its singular values and, thus, the nuclear norm of matrix $\rm X$ is equal to its trace. Hence, we solve the following (convex) semidefinite program (SDP) in $\rm X$
$$\begin{array}{ll} \text{minimize} & \mbox{tr} \left( \mathrm X \right)\\ \text{subject to} & \langle \mathrm s_1 \mathrm e_1^\top, \mathrm X \rangle = 1\\ & \langle \mathrm s_2 \mathrm e_2^\top, \mathrm X \rangle = 1\\ & \qquad\vdots\\ & \langle \mathrm s_n \mathrm e_n^\top, \mathrm X \rangle = 1\\ & \mathrm X \succeq \mathrm O_n\end{array}$$
Lastly, there are two possibilities:
If the optimal solution $\mathrm X^\star$ is rank-$1$, then $\mathrm X^\star = \mathrm x \mathrm x^\top$ and we have solved the system of $n$ quadratic equations.
If the optimal solution is not rank-$1$, then try another approach. Good luck.
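In the rank-$1$ case, $\mathrm x$ can be read off from $\mathrm X^\star = \mathrm x \mathrm x^\top$ via its leading eigenpair. A minimal numpy sketch, where `x_true` is a made-up positive vector used only to construct a rank-one $\mathrm X^\star$ for illustration:

```python
import numpy as np

# Made-up positive vector; Xstar stands in for a rank-1 SDP solution.
x_true = np.array([0.5, 1.0, 2.0])
Xstar = np.outer(x_true, x_true)

# eigh returns eigenvalues in ascending order; take the leading pair.
w, V = np.linalg.eigh(Xstar)
lam, v = w[-1], V[:, -1]
x = np.sqrt(lam) * v
x *= np.sign(x.sum())   # fix the eigenvector sign so that x > 0

assert np.allclose(np.outer(x, x), Xstar)
```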