Can this nonlinear vector equation be solved analytically?


I have to solve the following vector equation: $$ {\bf Ax} + {\bf b} + {\bf Cx}^{\circ -1} = {\bf 0}, $$ where ${\bf b}, {\bf x}, {\bf x}^{\circ -1}, {\bf 0} \in \mathbb{R}^{n}$, ${\bf A}, {\bf C} \in \mathbb{R}^{n \times n}$, and $\bf C$ is a diagonal matrix. Here ${\bf x}^{\circ -1}$ denotes the Hadamard (entrywise) inverse of ${\bf x}$, i.e. ${\bf x} \circ {\bf x}^{\circ -1} = {\bf 1} \in \mathbb{R}^{n}$. I thought about converting the equation into a quadratic vector equation and then "completing the square", but all my efforts were in vain. Is there any way to solve this equation analytically, whether by converting it into a quadratic one or by some other method?
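While the question asks for an analytic solution, it may help to see the equation solved numerically for a small made-up instance. This is a sketch only: the matrices `A`, `C`, the vector `b`, and the starting point are hypothetical example data chosen so that a real root exists, and `scipy.optimize.fsolve` is just a generic root finder, not a method specific to this problem.

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical n = 2 instance (all data made up for illustration).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
b = np.array([1.0, -1.0])
C = np.diag([-1.0, -1.0])  # C diagonal, as in the question

def F(x):
    # A x + b + C x^{o-1}, where x^{o-1} is the entrywise inverse of x
    return A @ x + b + C @ (1.0 / x)

# Root-find from a starting point with no zero entries.
x = fsolve(F, x0=np.array([0.5, 1.0]))
print(x, F(x))  # residual should be (numerically) zero at a root
```

Note that any root must have all entries nonzero, so the choice of starting point matters: `fsolve` cannot pass through a coordinate hyperplane where $1/x_i$ blows up.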



BEST ANSWER

In principle, you can multiply the $i$-th equation by $x_i$ to clear the entrywise inverse, which makes each entry of the left-hand side a quadratic polynomial, and then solve the resulting system of quadratics using a Gröbner basis. I doubt that you'll get anything very nice in general: you'll likely end up "reducing" your system to a single polynomial of degree $2^n$ in one of the $x_i$, with unpleasant coefficients.
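The elimination the answer describes can be sketched with sympy for a hypothetical $n = 2$ instance (the two quadratics below come from made-up data: row equations $3x_1 + x_2 + 1 - 1/x_1 = 0$ and $2x_2 - 1 - 1/x_2 = 0$, each multiplied through by its $x_i$). Under a lexicographic order, the Gröbner basis contains a univariate polynomial in $x_1$ of degree $2^n = 4$, as predicted.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# Quadratics from a hypothetical n = 2 instance, after multiplying
# row i of Ax + b + C x^{o-1} = 0 by x_i to clear the 1/x_i terms.
p1 = 3*x1**2 + x1*x2 + x1 - 1   # x1 * (3x1 + x2 + 1 - 1/x1)
p2 = 2*x2**2 - x2 - 1           # x2 * (2x2 - 1 - 1/x2)

# Lex order with x2 > x1 eliminates x2, leaving a univariate
# polynomial in x1 in the basis.
G = sp.groebner([p1, p2], x2, x1, order='lex')
for g in G.exprs:
    print(g)
```

Even in this tiny example the univariate polynomial is quartic; its roots have closed forms only because degree 4 is still solvable by radicals, which stops being true for larger $n$.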