Statistical Accuracy


I was reading about statistical filters, and I came across a sentence that I didn't understand:

You might be thinking that there's not much difference between 95 percent and 99.5 percent. But you would be wrong. A filter with a 99.5 percent accuracy rate is not merely 4.5 percent more effective than one that is 95 percent accurate, but 900 percent more effective!

Can someone explain how a filter that is 99.5 percent accurate is 900 percent more effective than 95 percent?

Best answer

Let's consider a system with $95~\%$ accuracy. It means that for every $10~000$ inputs (or cases, whichever you want), there would be about $(1-0.95)\times 10~000\approx 500$ errors.

Then compare it to a system with $99.5~\%$ accuracy. Now the number is $(1-0.995)\times 10~000\approx 50$ errors.

The first system makes $\frac{500-50}{50}\approx 900~\%$ more errors, i.e. ten times as many. That's what the text is trying to communicate: what matters is the error rate, not the accuracy figure itself.

PS: I put the "approximate" signs to indicate the expected number of errors.
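The arithmetic above can be checked with a few lines of code. This is a minimal sketch; the sample size of $10~000$ inputs and the helper name `expected_errors` are just illustrative choices, not anything from the original text:

```python
def expected_errors(accuracy, n_inputs=10_000):
    """Expected number of errors for a filter with the given accuracy."""
    return (1 - accuracy) * n_inputs

errors_95 = expected_errors(0.95)    # about 500 errors
errors_995 = expected_errors(0.995)  # about 50 errors

# Relative increase in errors of the 95 % filter over the 99.5 % one:
# (500 - 50) / 50 = 9, i.e. 900 % more errors.
relative_increase = (errors_95 - errors_995) / errors_995
print(f"{relative_increase:.0%} more errors")
```

Note that the comparison is done on the error counts, not on the accuracy percentages; that is exactly the point the quoted passage is making.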