When you have a set of raw data, there are many models you could fit to extrapolate a projection: linear, quadratic, logarithmic, etc.
How does one know which to use? I'm assuming it's determined by the nature of the data source itself, but if that source is unknown, what is the best method?
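To make the question concrete, here is a rough sketch of what I mean by trying several candidate forms on the same series. The numbers and the three model functions below are placeholders I made up for illustration, not real data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical 1-D series to be projected (placeholder values, not real measurements).
x = np.arange(1, 11, dtype=float)
y = np.array([2.1, 3.9, 5.2, 6.0, 6.9, 7.3, 7.9, 8.2, 8.6, 8.8])

# Candidate model forms; the parameters are fitted to the data.
def linear(x, a, b):
    return a * x + b

def quadratic(x, a, b, c):
    return a * x**2 + b * x + c

def logarithmic(x, a, b):
    return a * np.log(x) + b

for name, model in [("linear", linear), ("quadratic", quadratic), ("logarithmic", logarithmic)]:
    params, _ = curve_fit(model, x, y)
    residuals = y - model(x, *params)
    sse = np.sum(residuals**2)  # in-sample sum of squared errors
    print(f"{name}: params={np.round(params, 3)}, SSE={sse:.3f}")
```

Each form fits the observed points to some degree, which is exactly why I'm unsure how to choose between them for projecting forward.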
For the sake of example and as a discussion point, consider the world GDP growth rate.
I'd imagine the GDP growth rate is correlated with the population growth rate, which I believe is logarithmic, so a logarithmic model could therefore be used for projections (granted, there are other factors, such as life-span and automation, that could be considered and whose impact might not be minimal). If that is the case, is a logarithmic model sufficient, or does the exponential growth of automation need to be accounted for as well?
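As a sketch of what I have in mind by a logarithmic projection, assuming synthetic placeholder figures rather than actual world GDP data:

```python
import numpy as np
from scipy.optimize import curve_fit

def logarithmic(t, a, b):
    # y = a*ln(t) + b : growth that slows over time, as a saturating population might
    return a * np.log(t) + b

# Placeholder yearly growth-rate series (made up, not actual world GDP figures).
years = np.arange(1, 16, dtype=float)
growth = 1.5 * np.log(years) + 2.0 + np.random.default_rng(0).normal(0, 0.1, years.size)

params, _ = curve_fit(logarithmic, years, growth)

# Project the fitted curve five periods beyond the observed range.
future = np.arange(16, 21, dtype=float)
projection = logarithmic(future, *params)
print(np.round(projection, 2))
```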
Additionally, what if the data set had been arbitrary, so that I did not know its source and therefore could not correlate it to population growth or a Fibonacci-like pattern? Is there a good set of standards for determining which projection model to use, or is it up to interpretation?
I'm assuming one can measure the error, i.e. how far each data point falls from the fitted function, "rank" the candidate models by it, and then pick whichever projection model had the smallest total error. But this might not always be best, especially with a low sample size.
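To illustrate the ranking I'm describing, here is a leave-one-out cross-validation variant that I understand is one standard way to guard against a flexible model winning purely by overfitting a small sample. Again, the series is made up for illustration:

```python
import numpy as np

# Placeholder series (synthetic, for illustration only).
x = np.arange(1, 13, dtype=float)
y = 2.0 * np.log(x) + 1.0 + np.random.default_rng(1).normal(0, 0.15, x.size)

def loo_cv_error(degree):
    """Leave-one-out cross-validation error for a polynomial of the given degree."""
    errors = []
    for i in range(x.size):
        mask = np.arange(x.size) != i
        coeffs = np.polyfit(x[mask], y[mask], degree)   # fit without point i
        pred = np.polyval(coeffs, x[i])                 # predict the held-out point
        errors.append((pred - y[i]) ** 2)
    return np.mean(errors)

for degree in (1, 2, 3):
    print(f"degree {degree}: LOO-CV mean squared error = {loo_cv_error(degree):.4f}")
```

Scoring each model on points it was not fitted to seems like it would address my small-sample worry better than in-sample error alone, but I'd like to know whether that, or something like an information criterion, is the accepted standard.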