Regression with arbitrary norm


I have a function $f(X,Y,Z)$, which I know is a polynomial of degree $3$. I have a set of samples $(X_i,Y_i,Z_i)$ and the corresponding values of $f$. My task is to find (the best approximation of) the coefficients of the polynomial from the data in hand.

I can do this with least-squares approximation in Matlab, but the problem is that least squares minimizes the Euclidean distance from the points to the model (i.e. the error), and I need to measure the error with an entirely different norm.
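For reference, the least-squares baseline can be sketched in Python (numpy's `lstsq` plays the role Matlab's solver does here). The helper names, the monomial basis construction, and the synthetic data below are all illustrative, not from the question:

```python
import numpy as np

def monomial_exponents(degree=3):
    # All (a, b, c) with a + b + c <= degree: 20 monomials for degree 3.
    return [(a, b, c)
            for a in range(degree + 1)
            for b in range(degree + 1 - a)
            for c in range(degree + 1 - a - b)]

def design_matrix(X, Y, Z, degree=3):
    # Scale the [0, 255] inputs to [0, 1] first so the cubic columns
    # stay comparable in size and the matrix is better conditioned.
    X, Y, Z = (np.asarray(v) / 255.0 for v in (X, Y, Z))
    return np.column_stack([X**a * Y**b * Z**c
                            for (a, b, c) in monomial_exponents(degree)])

# Synthetic samples standing in for the real data.
rng = np.random.default_rng(0)
X, Y, Z = rng.uniform(0, 255, size=(3, 200))
f = 1.0 + 2.0 * X - 0.01 * Y * Z      # some cubic to recover (here degree 2)

A = design_matrix(X, Y, Z)
theta, *_ = np.linalg.lstsq(A, f, rcond=None)   # coefficients in scaled basis
```

The fitted `theta` reproduces `f` at the sample points up to rounding, since the target is itself a polynomial of degree at most $3$.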

How should I approach this?

At a high level, the polynomial handles input and output in some space $A$, but I should choose the coefficients so that the distance (error) measured in a space $B$ is minimal. The transformation between $A$ and $B$ is not linear, but it doesn't look too complicated (I haven't worked out all the details yet).

If it matters, $X$, $Y$ and $Z$ are in the range $[0,255]$ (in $A$).
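Once the $A \to B$ transformation is pinned down, the error in $B$ can be written directly. A minimal sketch in Python, where the transform `g` (a cube-root curve, chosen only because it is defined for all reals) is purely hypothetical and stands in for the not-yet-specified map:

```python
import numpy as np

def g(v):
    # Hypothetical nonlinear A -> B map; replace with the real transform.
    return np.cbrt(np.asarray(v, dtype=float))

def error_in_B(f_observed, f_predicted):
    # Distance measured in space B: Euclidean distance after mapping
    # both the observed and the predicted values through g.
    return np.sum((g(f_observed) - g(f_predicted)) ** 2)
```

The point is only that the loss is Euclidean *after* the transform, which is what makes it non-Euclidean in the original space $A$.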

Accepted answer:

Well, you have to solve the minimization problem $$ \sum_i d_B\bigl(f_i, f(X_i,Y_i,Z_i\mid\theta)\bigr)\to\min_\theta, $$ which in general has no explicit solution; when $d_B$ is the squared Euclidean distance you recover an ordinary linear least-squares problem. As @mpiktas mentioned, a general-purpose optimization algorithm is needed, e.g. gradient descent over $\theta$. Unfortunately, I have nothing more to add here.
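A minimal sketch of that approach in Python: plain gradient descent on $\sum_i d_B(f_i, f(X_i\mid\theta))$ for a model linear in $\theta$. The metric here is a smooth pseudo-Huber distance, a stand-in for the unspecified $d_B$; the toy one-variable data and all names are illustrative:

```python
import numpy as np

def psi(r, eps=1.0):
    # Derivative of the pseudo-Huber distance sqrt(r^2 + eps^2) - eps;
    # behaves like r near 0 and saturates at +/-1 for large residuals.
    return r / np.sqrt(r**2 + eps**2)

def fit_custom_metric(Phi, f, steps=20_000, lr=0.05):
    # Gradient descent over theta; Phi holds the basis functions, so the
    # model is f(x | theta) = Phi @ theta and the chain rule gives the
    # gradient Phi^T psi(residuals).
    theta = np.zeros(Phi.shape[1])
    for _ in range(steps):
        r = Phi @ theta - f
        theta -= lr * (Phi.T @ psi(r)) / len(f)
    return theta

# Toy data: f = 2 + 3x with one gross outlier that a squared-Euclidean
# fit would chase, but the saturating metric largely ignores.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 50)
f = 2.0 + 3.0 * x
f[0] += 100.0
Phi = np.column_stack([np.ones_like(x), x])
theta = fit_custom_metric(Phi, f)
```

For a metric without a convenient gradient, a derivative-free optimizer (e.g. `scipy.optimize.minimize` with the Nelder-Mead method) applied to the same sum works as well, just more slowly.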