2D Polynomial Regression with a Condition


Is there a method for polynomial regression in two dimensions (fitting a function $f(x,y)$ to a set of data $X$, $Y$, and $Z$)? And is there a way to impose a condition on the regression so that every fitted function passes through the line $x=0$?

BEST ANSWER

Linear regression handles polynomials of degree $n$, because a polynomial is linear in its coefficients. There is also non-linear regression for more general functions, but I think you want polynomials.

In practice there are guidelines for how many parameters you should fit to a model: too many parameters leads to overfitting.

Suppose you want to fit a polynomial of a single variable. Collect the powers of the observations into a design matrix $X$: $$X = \begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^n \\ 1 & x_2 & x_2^2 & \cdots & x_2^n \\ \vdots & \vdots & \vdots & \ddots & \vdots\\ 1 & x_m & x_m^2 & \cdots & x_m^n \end{pmatrix}$$

Then you want the coefficient vector $\hat \beta$ that best fits the data. In linear regression, we consider the model $z = X\beta + \epsilon$, where $\epsilon$ is an error term, and we look for the $\hat \beta$ that minimizes the squared error $\|\epsilon\|^2 = \|z - X\beta\|^2$.

It turns out that the answer is $\hat \beta = (X^TX)^{-1}X^Tz$.
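As a minimal sketch of this formula in one variable (the degree-2 polynomial and the synthetic data below are made up for illustration), we can build the design matrix and solve the least-squares problem with numpy; `lstsq` is preferred over forming $(X^TX)^{-1}$ explicitly:

```python
import numpy as np

# Synthetic data from a known polynomial, plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
z = 2.0 - 3.0 * x + 0.5 * x**2 + rng.normal(scale=0.01, size=x.size)

# Design (Vandermonde) matrix with columns 1, x, x^2
X = np.vander(x, N=3, increasing=True)

# beta_hat = (X^T X)^{-1} X^T z, computed stably via least squares
beta_hat, *_ = np.linalg.lstsq(X, z, rcond=None)
print(beta_hat)  # close to [2.0, -3.0, 0.5]
```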

Now, if we want to use both variables, we just widen the design matrix: alongside the powers of $x$ we also include the powers of $y$ (and, if desired, cross terms such as $x_i y_i$). Writing the extra columns as a matrix $Y$, the model becomes $$z = \begin{pmatrix} X & Y \end{pmatrix}\begin{pmatrix} \beta_x \\ \beta_y \end{pmatrix} + \epsilon,$$ which is just $z = A\beta$ for the augmented matrix $A = \begin{pmatrix} X & Y \end{pmatrix}$, so the same formula applies:

$$\hat \beta = (A^TA)^{-1}A^Tz$$

(Note that solving $\hat \beta_x = (X^TX)^{-1}X^Tz$ and $\hat \beta_y = (Y^TY)^{-1}Y^Tz$ separately is not equivalent in general; it only agrees with the joint solution when the columns of $X$ and $Y$ are orthogonal.)

The entries of $\hat \beta$ are the coefficients of our fitted polynomial.
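A sketch of the two-variable case, again with a made-up true surface: we stack columns for both variables (and one cross term) into a single design matrix and solve one least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
# True surface: f(x, y) = 1 + 2x - y + 0.5xy (coefficients chosen arbitrarily)
z = 1 + 2 * x - y + 0.5 * x * y + rng.normal(scale=0.01, size=x.size)

# One joint design matrix with columns for both variables and a cross term
A = np.column_stack([np.ones_like(x), x, y, x * y])
beta_hat, *_ = np.linalg.lstsq(A, z, rcond=None)
print(beta_hat)  # close to [1.0, 2.0, -1.0, 0.5]
```

Solving the single system recovers all the coefficients at once, which is why the joint design matrix is the right formulation.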

If you want the fit to pass through the origin, remove the column of ones (the intercept). If you want it to vanish along the entire line $x = 0$, keep only the columns whose terms contain a factor of $x$.