I recently learned how to apply the least squares method to do linear regression. I also understand that it can be used for quadratic regression by minimizing the error over three parameters (two coefficients and a constant) instead of two. Would the same method apply to most, or all, types of equations? Could I simply introduce coefficients wherever possible, plus a constant, take the partial derivative with respect to each, set them equal to zero, and solve? For example, could I regress to $a\log(bx)+c$? Could I use logarithms, sine waves, exponential functions, etc.? If not, what are the exceptions? Where is this method not possible, and why?
Thanks in advance for all responses.
As I said in my comment above: yes, but you then have to distinguish between linear and non-linear least squares. The two cases are solved differently, depending on how the unknown coefficients enter the model: a linear problem has a closed-form solution, while a non-linear one must generally be solved iteratively.
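To make the distinction concrete, here is a sketch of the standard formulation. Given data $(x_i,y_i)$, $i=1,\dots,n$, least squares seeks the coefficients $\theta$ that minimize $$S(\theta)=\sum_{i=1}^{n}\bigl(y_i-f(x_i;\theta)\bigr)^2.$$ If $f$ is linear in the coefficients, say $f(x;\theta)=\sum_j \theta_j\varphi_j(x)$ for known functions $\varphi_j$, then setting $\partial S/\partial\theta_j=0$ gives the linear normal equations $$\Phi^\mathsf{T}\Phi\,\hat\theta=\Phi^\mathsf{T}y,\qquad \Phi_{ij}=\varphi_j(x_i),$$ which are solved in one step. If $f$ is non-linear in the coefficients, those same stationarity conditions are themselves non-linear equations, and one typically solves them iteratively (e.g. Gauss–Newton or Levenberg–Marquardt) starting from an initial guess.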
In response to your request for examples, some are given below. Note that I will use $a,\,b$ and $c$ as the coefficients to be determined, $x$ as the independent/predictor variable, and $y$ as the dependent/response variable.
Linear examples: $$\begin{align} y&=a+bx\\ \ln y&=a+b\ln x \quad(\text{equivalent to the nonlinear form } y=e^{a}x^{b})\\ y^2&=a+bx^2-ce^x \end{align}$$ These are linear because the equations are linear in the unknown coefficients, even though they may be non-linear in $x$ and $y$.
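If it helps, here is a minimal Python sketch of solving the second linear example, $\ln y=a+b\ln x$, with NumPy's linear least-squares routine (the true coefficients, noise level, and data range below are made up purely for illustration):

```python
import numpy as np

# Synthetic data from y = e^a * x^b with (made-up) true values a = 0.5, b = 2
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
y = np.exp(0.5) * x**2 * rng.lognormal(sigma=0.05, size=x.size)  # multiplicative noise

# The transformed model ln y = a + b ln x is linear in (a, b),
# so ordinary linear least squares applies after taking logs.
A = np.column_stack([np.ones_like(x), np.log(x)])  # design matrix: columns 1 and ln x
(a, b), *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
print(f"a ~ {a:.3f}, b ~ {b:.3f}")  # should recover roughly 0.5 and 2
```

One caveat: fitting in log space minimizes the squared error in $\ln y$, not in $y$ itself, which is the usual trade-off when linearizing a model this way.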
Nonlinear examples: $$\begin{align} y&=ax^b+c\\ y&=a\sin\left(bx+c\right)\\ y&=\frac{x+a}{x+b}\quad(\text{equivalent to the linear form } by-a=x-xy) \end{align}$$ These are non-linear because the equations are non-linear in the unknown coefficients.
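For comparison, here is the same kind of sketch for the second nonlinear example, $y=a\sin(bx+c)$, using SciPy's iterative least-squares fitter (again, the true values, noise, and initial guess are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    return a * np.sin(b * x + c)

# Synthetic data with (made-up) true values a = 2.0, b = 1.5, c = 0.3
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 200)
y = model(x, 2.0, 1.5, 0.3) + rng.normal(scale=0.1, size=x.size)

# Non-linear least squares needs a starting guess; a poor one can
# converge to a local minimum rather than the best fit.
p0 = [1.0, 1.0, 0.0]
popt, pcov = curve_fit(model, x, y, p0=p0)
print("a, b, c ~", popt)  # should land near (2.0, 1.5, 0.3)
```

The need for a starting guess, and the possibility of converging to the wrong local minimum, is the practical price of non-linearity in the coefficients.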