My textbook has an example of interpolation, but I am not sure how the book did it since it doesn't explain it.
It says if we want $P(Z < 1.246)$ we must use interpolation and the steps given are:
$$P(Z < 1.24) + (6/10)[P(Z < 1.25) - P(Z < 1.24)].$$
Can someone explain where the $(6/10)$ came from? Shouldn't it be $(6/1000),$ since $1.24 + 0.006 = 1.246$? I am very confused about how they got the $6$ out of $1.246.$
To find $P(Z < 1.246),$ the possible entry points into the table are $1.24$ and $1.25.$ The distance between them is $0.01,$ and the distance from $1.24$ to $1.246$ is $0.006.$ Then $0.006/0.01 = 6/10.$
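To make that concrete, here is the arithmetic with typical four-place table values (assuming your table gives $P(Z < 1.24) = 0.8925$ and $P(Z < 1.25) = 0.8944$):

$$P(Z < 1.246) \approx 0.8925 + \tfrac{6}{10}(0.8944 - 0.8925) = 0.8925 + 0.00114 = 0.89364.$$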
Here are exact values from R statistical software:
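If you don't have R at hand, a short sketch in Python reproduces the same quantities: the standard normal CDF (what R calls `pnorm`) can be written with the standard library's `math.erf`.

```python
from math import erf, sqrt

def pnorm(z):
    # Standard normal CDF via the error function:
    # Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return (1 + erf(z / sqrt(2))) / 2

for z in (1.24, 1.246, 1.25):
    print(f"P(Z < {z}) = {pnorm(z):.7f}")
```

Rounded to four places, `pnorm(1.24)` and `pnorm(1.25)` match the table entries $0.8925$ and $0.8944,$ and the exact $P(Z < 1.246)$ agrees with the interpolated value to roughly four decimal places.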
I suppose you get something like $0.8936$ from the suggested method. There are two reasons why this 'linear interpolation' method may not give an exact answer: (a) tables are rounded to four (maybe five) places, so there is some rounding error, and (b) the normal curve is 'almost' linear over such a short distance, but it is really a curve, not a line.
Note: This method works reasonably well with various kinds of probability tables. In case you encounter them later, linear interpolation works less well with tables of the F distribution, and not at all with tables of the chi-squared distribution when different 'degrees of freedom' are involved.