I wrote some Python that applies the SVD, then uses it to find the best-fit x for some matrix A, i.e. the x for which Ax best approximates an output vector b.
Using the SVD with my made-up Ax = b: \begin{equation} A = U \Sigma V^* \quad \text{and} \quad Ax = b \;\; , \end{equation} then (I think) multiplying both sides by V \Sigma^{-1} U^*, \begin{equation} V \Sigma^{-1} U^* (Ax) = V \Sigma^{-1} U^* (b) \;\; , \end{equation} eliminates A (as I understand it, V \Sigma^{-1} U^* is the pseudoinverse of A, and V \Sigma^{-1} U^* A = I when A has full column rank), so \begin{equation} x = V \Sigma^{-1} U^* (b) \;\; , \end{equation} and x is the best-fit vector. I hope my equations are correct; I translated them from my code, which is partly Python packages and partly my own, but I do get a vector x.
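For reference, here is a minimal NumPy sketch of this pseudoinverse solve (the data is my own toy example matching the first experiment: a single-column A of 0 to 50 and b = 7 times that column):

```python
import numpy as np

# Least-squares solve via the SVD pseudoinverse: x = V Sigma^{-1} U^* b.
A = np.arange(51, dtype=float).reshape(-1, 1)     # 51 x 1 column of 0..50
b = 7.0 * np.arange(51, dtype=float)              # exact output, no noise

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # thin SVD: A = U @ diag(s) @ Vt
x = Vt.T @ np.diag(1.0 / s) @ U.T @ b             # V Sigma^{-1} U^T b
print(x)  # → [7.]
```

Note that `np.diag(1.0 / s)` assumes all singular values are nonzero; `np.linalg.lstsq(A, b)` does the same job while handling rank deficiency.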
So when I mess around, I make A a column vector of (0 to 50), and my output vector b = (0 to 50)*7. The SVD finds the best-fit vector x = [7]. Makes sense, because my function is f(i) = 7*i.
When I make A with two dimensions, both (0 to 50), and my output b = (0 to 50)*7 + (0 to 50)*4, my best-fit vector is [7, 4]. It solves for x correctly because f(i, j) = 7*i + 4*j.
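A sketch of this two-input case. I am assuming the two inputs vary independently (here on a grid of (i, j) pairs); if both columns of A were literally the same 0-to-50 vector, A would be rank-deficient and the least-squares x would not be unique:

```python
import numpy as np

# Hypothetical reconstruction of the two-input experiment:
# rows of A are (i, j) pairs, so the two columns are linearly independent.
i, j = np.meshgrid(np.arange(51.0), np.arange(51.0), indexing="ij")
A = np.column_stack([i.ravel(), j.ravel()])       # 2601 x 2
b = 7.0 * A[:, 0] + 4.0 * A[:, 1]                 # f(i, j) = 7*i + 4*j

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ np.diag(1.0 / s) @ U.T @ b             # V Sigma^{-1} U^T b
print(np.round(x, 6))  # → [7. 4.]
```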
I then make A with three dimensions and an output (0 to 50)*7 + (0 to 50)*4 + (0 to 50)^2. The best-fit x fails for the function f(i, j, k) = 7*i + 4*j + k^2. Does the method I'm using to find the best-fit x only work for linear inputs? Is there a way to use this best-fit method for polynomials?
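A sketch of the failing case (again my own reconstruction, with i, j, k varying independently on a grid). The model A @ x = x1*i + x2*j + x3*k is linear in k, so no x can reproduce the k^2 term. One standard workaround, not from the post, is to append k**2 itself as an extra column: the model stays linear in the unknowns x, so the same SVD machinery applies unchanged.

```python
import numpy as np

# Reconstruction of the three-input experiment on a grid (my assumption).
rng = np.arange(51.0)
i, j, k = (g.ravel() for g in np.meshgrid(rng, rng, rng, indexing="ij"))
A = np.column_stack([i, j, k])          # columns are the raw, linear inputs
b = 7.0 * i + 4.0 * j + k**2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ np.diag(1.0 / s) @ U.T @ b
print(np.linalg.norm(A @ x - b))        # large residual: k^2 is outside col(A)

# Workaround sketch: use k**2 as a feature column. The fit is still
# linear least squares, just over polynomial features.
A2 = np.column_stack([i, j, k**2])
U2, s2, Vt2 = np.linalg.svd(A2, full_matrices=False)
x2 = Vt2.T @ np.diag(1.0 / s2) @ U2.T @ b
print(np.round(x2, 6))                  # → [7. 4. 1.]
```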