Calculate the error given a tolerance

I have a noob statistics question.

Is there a function, such that given the residuals from the line of best fit, and a probability, A, it will return B such that there is an A probability of being within a radius B units from the line of best fit? Can standard deviation be used in this case?

For example: I have a set of data predicting how much a rubber band will stretch. I can vary the length (independent variable) and measure how far it stretches (dependent variable). I have a linear regression using Google Sheets functions (something like this). I want to be able to say that there is a 95% chance my rubber band will be within 5 centimeters of the predicted value.
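The calculation being asked about can be sketched in Python (a minimal sketch with invented data values; it assumes roughly normal errors, and the ±1.96·s radius is only the naive "known distribution" version of the interval):

```python
import statistics

# Hypothetical data: independent variable x vs. measured stretch y (cm).
# These numbers are made up for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# Ordinary least-squares slope and intercept.
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sxy / sxx
intercept = ybar - slope * xbar

# Residual standard deviation, with n - 2 degrees of freedom
# because two parameters (slope, intercept) were fitted.
residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5

# If the errors were exactly normal with known spread, about 95% of
# observations would fall within z * s of the line, where z is the
# two-sided 97.5% normal quantile (about 1.96).
z = statistics.NormalDist().inv_cdf(0.975)
B = z * s
print(f"About 95% of points expected within +/-{B:.2f} cm of the line")
```

This treats the fitted line and `s` as if they were the true line and true standard deviation, which is exactly the caveat the answer below raises.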

I have done outside research, but it seems that more of statistics is concerned with proving two variables are related, instead of calculating the error given a tolerance probability.

There is 1 best solution below.


Your example is confusing, because the length is the stretch, so IV = DV and there would be no error. You'd need something like force vs. stretch.

Aside from that, it sounds like you'd like to know what bounds around your fitted line will capture a given proportion of future observations. If so, you want a tolerance interval for simple linear regression. However, this is slightly different from what you asked for, because a tolerance interval has two probabilities associated with it: one for the proportion of future observations that will fall in the interval, and another for the level of confidence that such an interval actually contains at least that proportion. Your function is more like a probability interval, which you can only get if you know the underlying distribution with certainty.
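To make the two-probability point concrete, here is a rough Python sketch of the two-sided normal tolerance factor k using Howe's approximation, with the chi-square quantile itself approximated by the Wilson-Hilferty formula (both are standard textbook approximations, not exact values, and applying them with ν = n − 2 residual degrees of freedom in a regression setting is itself an approximation):

```python
import statistics

def two_sided_tolerance_factor(n: int, nu: int,
                               p: float = 0.95, gamma: float = 0.95) -> float:
    """Howe's approximation to the two-sided normal tolerance factor k.

    The interval (fit - k*s, fit + k*s) is built so that, with confidence
    gamma, it contains at least a proportion p of future observations.
    """
    norm = statistics.NormalDist()
    z_p = norm.inv_cdf((1 + p) / 2)   # about 1.96 for p = 0.95
    # Lower (1 - gamma) chi-square quantile via Wilson-Hilferty.
    z_q = norm.inv_cdf(1 - gamma)     # about -1.645 for gamma = 0.95
    h = 2.0 / (9.0 * nu)
    chi2_low = nu * (1 - h + z_q * h ** 0.5) ** 3
    return z_p * (nu * (1 + 1 / n) / chi2_low) ** 0.5

# With few data points the factor is much wider than the naive 1.96...
k_small = two_sided_tolerance_factor(n=20, nu=18)
# ...and it shrinks toward 1.96 as the sample size grows.
k_large = two_sided_tolerance_factor(n=10_000, nu=9_998)
print(k_small, k_large)
```

The gap between k and the plain normal quantile 1.96 is the price of the second probability: the confidence that your estimated line and estimated s are good enough for the band to really cover 95% of future observations.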