I have a system of cubic equations:
$$0=A_0+A_1 x+A_2 ( x \otimes x ) + A_3( x \otimes x \otimes x )$$
where $\dim A_0 = \dim x$ (so there are as many equations as unknowns). You may assume that the system has finitely many solutions, and at least one.
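For concreteness, the residual can be evaluated as follows, under the assumption (mine, not stated above) that $A_2$ and $A_3$ are stored as $n \times n \times n$ and $n \times n \times n \times n$ arrays, so the Kronecker products become tensor contractions:

```python
import numpy as np

def residual(x, A0, A1, A2, A3):
    """Evaluate F(x) = A0 + A1 x + A2 (x ⊗ x) + A3 (x ⊗ x ⊗ x).

    Assumes A2 has shape (n, n, n) and A3 has shape (n, n, n, n),
    so the Kronecker products are contracted with einsum.
    """
    return (A0
            + A1 @ x
            + np.einsum('ijk,j,k->i', A2, x, x)
            + np.einsum('ijkl,j,k,l->i', A3, x, x, x))
```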
I seek an efficient numerical algorithm that is guaranteed to converge to the solution with minimum $L_2$ norm (i.e. the one for which $\| x \|_2$ is smallest).
One algorithm would be to use homotopy continuation to find all the solutions, then simply pick the one with the lowest norm. However, my hunch is that a much faster algorithm ought to be possible, either by exploiting the fact that for small $x$, $0 \approx A_0 + A_1 x$, or by repeated numerical minimisation of:
$$\|A_0+A_1 x+A_2 ( x \otimes x ) + A_3( x \otimes x \otimes x )\|_2 + \lambda \| x \|_2 $$
where $\lambda$ is gradually reduced from a very large value to $0$.
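The $\lambda$-continuation idea can be sketched as below. This is a heuristic only (no guarantee the iterates track the minimum-norm branch); I use `scipy.optimize.least_squares` and square the objective, stacking $\sqrt{\lambda}\,x$ onto the residual so that the squared norm being minimised is $\|F(x)\|_2^2 + \lambda \|x\|_2^2$. The tensor storage convention and the $\lambda$ schedule are my assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def min_norm_solve(A0, A1, A2, A3, lambdas=None):
    """Heuristic continuation: minimise ||F(x)||^2 + lam * ||x||^2
    while lam is reduced towards 0, warm-starting each stage from
    the previous solution. No guarantee of reaching the true
    minimum-norm root; this is a sketch of the idea only."""
    n = len(A0)
    if lambdas is None:
        lambdas = np.geomspace(1e3, 1e-10, 30)  # assumed schedule
    x = np.zeros(n)
    for lam in lambdas:
        def penalised(x):
            F = (A0 + A1 @ x
                 + np.einsum('ijk,j,k->i', A2, x, x)
                 + np.einsum('ijkl,j,k,l->i', A3, x, x, x))
            # Stacking sqrt(lam)*x makes the squared residual equal
            # to ||F||^2 + lam * ||x||^2.
            return np.concatenate([F, np.sqrt(lam) * x])
        x = least_squares(penalised, x).x
    return x
```

On a 1-variable test with roots $\{1,2,3\}$ the path starting at $x=0$ does land on the minimum-norm root $x=1$, but nothing forces this in general.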
Has this problem been analysed before? If so, directions to the relevant literature would be a great help. As indeed would comments on my hunches, or further ideas.
Thanks in advance for any assistance.
This is an ordinary system of $n$ polynomials of degree $3$ in $n$ variables, written in a fancy form, so by Bézout's theorem it may have up to $3^n$ solutions.
The usual techniques apply: Gröbner-basis methods are available if the matrices have rational entries, as are homotopy continuation packages such as PHCpack. Project-Lift-Intersect methods are becoming available, but are not commonly used.
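As a toy illustration of the "find all solutions, then pick the smallest" route, SymPy's solver (which falls back on Gröbner-basis techniques for polynomial systems) handles small instances; the equations below are invented for the example:

```python
from sympy import symbols, solve

x, y = symbols('x y')

# Toy cubic system (decoupled, so the solution set is obvious):
# all combinations of x in {1, 2, 3} and y in {1, 2, 4}.
eqs = [(x - 1)*(x - 2)*(x - 3),
       (y - 1)*(y - 2)*(y - 4)]

sols = solve(eqs, [x, y], dict=True)  # all 9 solutions

# Keep the real solutions and pick the one of minimum Euclidean norm.
real = [s for s in sols if all(v.is_real for v in s.values())]
best = min(real, key=lambda s: float(s[x])**2 + float(s[y])**2)
```

This scales very badly with $n$, of course; for larger systems a numerical continuation package is the practical choice.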