In linear regression we use gradient descent to find the global minimum, provided that the cost function is convex.
J = x0 + x1*theta1 + x2*theta2
But if we also want to use products of features as features,
J = x0 + x1*theta1 + x2*theta2 + x1*x2*theta3 + x1^2 * theta4 + x1^2 * x2^2 * theta5 ....
will the new cost function still be convex? If not, then as far as I know we can't guarantee that we reach the global minimum. In that case, what is the solution (other than using neural networks)?
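Concretely, the kind of feature expansion I mean looks like this (a sketch; the theta values are just illustrative):

```python
import numpy as np

def hypothesis(x1, x2, theta):
    # feature vector: bias, x1, x2, x1*x2, x1^2, x1^2 * x2^2
    features = np.array([1.0, x1, x2, x1 * x2, x1**2, x1**2 * x2**2])
    return features @ theta

theta = np.array([0.5, 1.0, -1.0, 2.0, 0.3, 0.1])
print(hypothesis(2.0, 3.0, theta))  # 0.5 + 2 - 3 + 12 + 1.2 + 3.6 = 16.3
```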
Everything else on this page is quite old, so perhaps no one else will see this, but SO showed me this question as related while I was drafting a question of my own.
Anyway, I want to point out that while it's true that terms of the form $x_1 x_2 \theta_3$ make your new function non-convex in $x$, it may, depending on the required domain, be log-log convex.
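To see the non-convexity in $x$ concretely, here is a minimal check that $f(x_1, x_2) = x_1 x_2$ violates Jensen's inequality between two sample points (the points are just illustrative):

```python
# f is the bilinear term x1 * x2 from the expanded feature set.
def f(x1, x2):
    return x1 * x2

a, b = (0.0, 2.0), (2.0, 0.0)
mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # midpoint (1, 1)

# Convexity would require f(mid) <= (f(a) + f(b)) / 2,
# but here f(mid) = 1 while the average is 0, so f is not convex.
print(f(*mid), (f(*a) + f(*b)) / 2)
```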
You may be interested in "disciplined convex programming", which is a set of rules for constructing functions that are assured to be convex. Closely related is "disciplined geometric programming" which constructs functions that are guaranteed to be log-log convex. In these, multiplication of optimization variables is an allowed operation, but terms must be positive, and subtraction is disallowed.
Because the original question was written with what looks like a code snippet, I'll point out that cvxpy is a Python package that supports both disciplined convex programming and disciplined geometric programming, and has good documentation explaining both in greater detail. The relevant portion for log-log convexity is here: https://www.cvxpy.org/tutorial/dgp/index.html