Percentage of a percentage to get a normalized trend


I am not very good at math and would just like to understand whether what I am doing makes sense:

I have two different teams estimating their effort on two identical projects. Team A estimates the work to be 1000 hours. Team B estimates the work to be 450 hours.

Now, in the middle of the project:
Team A worked 500 hours and made 10 mistakes = 2%
Team B worked 225 hours and made 7 mistakes = 3.1%
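
To make the arithmetic explicit, these figures are mistakes divided by hours, times 100:

$$\text{Team A: } \frac{10}{500}\times 100 = 2\%, \qquad \text{Team B: } \frac{7}{225}\times 100 \approx 3.1\%.$$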

Although team B finished the same amount of work, they did it in fewer hours, in line with their lower estimate, so their numbers look worse.

They continue and at the end of the project:
Team A worked 1000 hours and made 17 mistakes in total = 1.7%
Team B worked 500 hours and made 12 mistakes in total = 2.4%

So I cannot compare the teams' results directly because the estimates are too different. I believe the only thing I can do is measure how each team's results change over time, such as:

Team A's mistake rate declined by 15% (from 2% to 1.7%)
Team B's mistake rate declined by about 23% (from 3.1% to 2.4%)
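
Here the decline is the relative change of the rate:

$$\frac{2\% - 1.7\%}{2\%}\times 100 = 15\%, \qquad \frac{3.1\% - 2.4\%}{3.1\%}\times 100 \approx 23\%$$

(about $22.9\%$ for team B if you use the unrounded rates).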

Would this at least make sense as a way to understand the trend?


Best Answer

To me, saying that team $A$ made "$2\%$" mistakes makes no sense.

A percentage is used to express a part of something relative to the whole. For example, if I have 5 apples, then one of those apples is $20\%$ of all apples. I get the number $20$ by dividing the number of chosen apples, $1$, by the number of all apples, $5$, and multiplying by $100$.


What you are doing is dividing the number of mistakes by the number of hours, but it makes no sense to call the result a "percentage". What you are actually measuring is "mistakes per hour", not a percentage.

Calling them mistakes per hour, you can then conclude that the number of mistakes per hour for team $A$ dropped by $15\%$ and for team $B$ by about $23\%$.
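
To spell that out with the numbers from your question (rounded to three decimals):

$$\text{Team } A:\ \frac{10}{500}=0.020 \;\longrightarrow\; \frac{17}{1000}=0.017 \ \text{mistakes per hour}$$

$$\text{Team } B:\ \frac{7}{225}\approx 0.031 \;\longrightarrow\; \frac{12}{500}=0.024 \ \text{mistakes per hour}$$

and the relative declines are $\frac{0.020-0.017}{0.020}=15\%$ and $\frac{0.031-0.024}{0.031}\approx 23\%$.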