Calculating probability when failure rate is known


I've been developing a web application that features barcode scanning. Before implementing a fix, the failure rate was approximately 50%.

The solution implemented involves scanning the barcode 5 times before determining the correct decoded value.
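A minimal sketch of that retry approach, assuming a hypothetical `scan_once` callable that returns the decoded value on success or `None` on a failed scan (the real scanning API is not shown in the question):

```python
def scan_with_retries(scan_once, max_attempts=5):
    """Call scan_once up to max_attempts times; return the first
    successful decode, or None if every attempt fails."""
    for _ in range(max_attempts):
        value = scan_once()
        if value is not None:
            return value
    return None
```

With this structure, the overall operation fails only when all five individual scans fail, which is what the probability calculation below relies on.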

I was told by a fellow developer that the new failure rate percentage could be calculated by:

$$ (0.5 * 0.5 * 0.5 * 0.5 * 0.5) * 100 $$

Essentially, the assumed failure rate raised to the power of the number of attempts, multiplied by 100 to get a percentage. The failure rate would now be 3.125%.
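The calculation can be sketched directly, assuming the 50% per-scan failure rate and 5 attempts stated above:

```python
failure_rate = 0.5        # assumed probability that a single scan fails
attempts = 5              # number of retries before giving up

# All attempts must fail for the overall operation to fail,
# so multiply the per-scan failure probabilities together.
overall_failure = failure_rate ** attempts
print(overall_failure * 100)  # 3.125 (percent)
```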

Is this correct? If so, what formula or principle does this follow?

Best answer:

You are assuming that each scan has a $50\%$ chance of succeeding and that the chances are independent of each other. Presumably you have some check code in the scan and you declare success if the check code passes. Then a failure is like the chance of flipping a coin five times and getting five tails. You need all the events to happen, so you multiply the probabilities.
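The coin-flip analogy can be checked with a quick Monte Carlo simulation under those same assumptions (independent scans, 50% failure each); the hypothetical parameters here are illustrative, not from the question:

```python
import random

random.seed(0)          # fixed seed for reproducibility
trials = 100_000
p_fail = 0.5            # assumed per-scan failure probability

# Count trials in which all five independent "scans" fail.
all_five_fail = sum(
    all(random.random() < p_fail for _ in range(5))
    for _ in range(trials)
)
print(all_five_fail / trials)  # close to 0.5**5 = 0.03125
```

The empirical frequency converges to $0.5^5 = 0.03125$, matching the multiplication rule for independent events.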

The math is fine, but note the assumptions that went into it. Maybe some of your bar codes are messy, so the chance of failure is greater than $50\%$ for them. If half the codes never scan right and half always scan right, you would have an average $50\%$ failure rate, but five scans would still leave a $50\%$ failure rate. If the $50\%$ has been carefully measured it is one thing, but if it is an estimate it may be wrong. If the failure rate on one scan is $60\%$, the chance of five failures goes up to $0.6^5=7.776\%$, more than twice as high, even if the scans are independent. Maybe your success rate is unrealistically high because some of the incorrect scans pass the check code.
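The two caveats above can be made concrete with a small sketch; the split population and the 60% figure are the hypothetical cases described in the answer, not measured values:

```python
# Case 1: half the codes never scan right (per-scan failure 1.0),
# half always scan right (per-scan failure 0.0). The average
# per-scan failure rate is still 50%, but retries don't help the
# bad half, so five attempts still fail half the time.
p_fail_after_5 = 0.5 * 1.0**5 + 0.5 * 0.0**5
print(p_fail_after_5)  # 0.5

# Case 2: the 50% estimate is off and the true per-scan failure
# rate is 60%. The five-attempt failure rate more than doubles.
p = 0.6
print(p**5 * 100)  # 7.776 (percent)
```

This shows why the $3.125\%$ figure is only as good as the independence assumption and the accuracy of the $50\%$ estimate.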