How to quantify the difference between 2/4 and 20/40?


Assume I have two methods for making predictions. The first method makes 4 predictions, and 2 out of 4 are correct. The second method makes 40 predictions, and 20 out of 40 are correct. The prediction precision of both methods is the same: 2/4 = 20/40 = 0.5. But I think the second method is better than the first, because it makes more correct predictions. Is there a measure to quantify this? Any suggestion may help :) Thanks in advance.


There is 1 best solution below

On BEST ANSWER

We want to estimate the probability of success when we make a prediction. We are using the estimator $S/N$, where $N$ is the number of trials and $S$ is the number of successes. In the case you describe, both estimates are $\frac{1}{2}$.

However, the estimator $S/N$ has variance $\frac{p(1-p)}{N}$, where $p$ is the true probability of success. Since the variance is inversely proportional to $N$, the variance of the estimator when $N=40$ is $\frac{1}{10}$ times the variance when the estimator is based on $N=4$. Equivalently, the standard deviation of the estimator when $N=40$ is $\frac{1}{\sqrt{10}}$ times the standard deviation when $N=4$. The estimate based on $40$ trials is therefore more reliable than the estimate based on $4$ trials, and variance (or standard deviation) is a way of quantifying how much more reliable.
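As a quick numerical check, here is a minimal sketch under the binomial model described above. The helper name `standard_error` is my own; it just plugs the estimate $\hat{p} = S/N$ into $\sqrt{\hat{p}(1-\hat{p})/N}$:

```python
import math

def standard_error(successes, trials):
    """Standard error of the binomial proportion estimator S/N,
    using the plug-in estimate p_hat = successes / trials."""
    p_hat = successes / trials
    return math.sqrt(p_hat * (1 - p_hat) / trials)

se_method1 = standard_error(2, 4)    # 2 correct out of 4
se_method2 = standard_error(20, 40)  # 20 correct out of 40

print(se_method1)               # 0.25
print(se_method2)               # ~0.0791
print(se_method1 / se_method2)  # ~3.162, i.e. sqrt(10)
```

Both methods give the same point estimate ($0.5$), but the second method's estimate has a standard error $\sqrt{10} \approx 3.16$ times smaller, which is one concrete way to quantify the difference you noticed.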