I'm studying Machine Learning and Artificial Neural Networks. Some basic techniques in Machine Learning are linear regression, multivariate linear regression, and nonlinear regression. The last of these, nonlinear regression, involves fitting a curve to a set of data. This curve can take almost any shape; it might even fail the vertical line test, in which case it describes a relation rather than a function.
In the tutorial I am following, Stanford's Machine Learning course by Andrew Ng, he says that the sigmoid function is used for nonlinear regression, and that the more polynomial terms you include, the more complex the resulting curve can be.
I'll give an example:
$$y=\frac{1}{1+e^{-\left(100x^2-100y\right)}}$$
approximates the polynomial:
$$y=x^2$$
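To see this numerically, here is a quick sketch in Python (my own check, not from the course; the function name `implicit_y` and the bisection approach are just for illustration). For each sample $x$ it solves the implicit equation $y=\sigma(100x^2-100y)$ for $y$ and compares the result to $x^2$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def implicit_y(x, lo=0.0, hi=1.0, iters=60):
    """Solve y = sigmoid(100*x**2 - 100*y) for y by bisection.

    f(y) = sigmoid(100*(x**2 - y)) - y is strictly decreasing in y,
    so it has exactly one root in (0, 1).
    """
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sigmoid(100.0 * (x**2 - mid)) - mid > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for x in [0.3, 0.5, 0.7, 0.9]:
    y = implicit_y(x)
    # The solved y tracks x^2 closely while x^2 stays away from 0 and 1,
    # because the large coefficient 100 makes the sigmoid very steep.
    print(f"x={x:.1f}  solved y={y:.4f}  x^2={x*x:.4f}")
```

The steep slope (the factor of 100) is what makes the implicit curve hug $y=x^2$; with a small coefficient the sigmoid is too shallow and the approximation degrades.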
The sigmoid function can also approximate curves that are relations instead of functions. For example:
$$y=\frac{1}{1+e^{-\left(5+x-3y+4xy-yx^2-2yx^3-xy^2+30y^3\right)}}$$
But if I use enough polynomial terms, can this function approximate any function or relation?
It seems like this needs to be true; otherwise the sigmoid function would be a bad choice of hypothesis for nonlinear regression.