I'm a bit lost on which approach gives the correct percentage here.
Imagine we have the following table:
+----------+-------+--------------------------------+----------------------------------+
| RESOURCE | LEVEL | DESIRED HOURS EACH DAY (MAX 8) | AVAILABLE HOURS EACH DAY (MAX 8) |
+----------+-------+--------------------------------+----------------------------------+
|        1 | ONE   |                              8 |                                0 |
|        2 | TWO   |                              5 |                                8 |
|        3 | THREE |                              6 |                                4 |
|        4 | ONE   |                              4 |                                2 |
|        5 | TWO   |                              6 |                                2 |
|        6 | TWO   |                              7 |                                7 |
|        7 | THREE |                              0 |                                8 |
+----------+-------+--------------------------------+----------------------------------+
I want to know, per LEVEL, what percentage of that LEVEL I have at my disposal. For RESOURCE 1 I would need 8 hours each day, but 0 hours are available. RESOURCE 4 is also LEVEL ONE and is available for 2 of its 4 desired hours. One option is to average the raw hour differences and then divide by the 8-hour day:

((0 - 8) + (2 - 4)) / 2 / 8 = (-8 + -2) / 2 / 8 = -0.625
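Here is that first calculation as a quick Python sketch (the variable names are just my own; only the numbers come from the table):

```python
# LEVEL ONE resources from the table: (desired, available) hours per day
level_one = [(8, 0), (4, 2)]  # RESOURCE 1 and RESOURCE 4

# Average the raw hour differences, then divide by the 8-hour day
diffs = [available - desired for desired, available in level_one]
result = sum(diffs) / len(diffs) / 8
print(result)  # -0.625
```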
Or should it be:

((8/8 - 0/8) + (4/8 - 2/8)) / 2 = (1 + 0.25) / 2 = 0.625

because this first converts each resource's desired and available hours into percentages of the day, and then subtracts those percentages per resource? As far as I can tell this is just the first result with the sign flipped: one calculation shows minus 62.5% and the other plus 62.5%. I am lost as to which one I should use to express how much availability I have compared to the desired hours; any help is appreciated.
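To compare the two formulas across every LEVEL, I put the table into a small Python sketch (the structure and names are my own; only the numbers come from the table). For LEVEL ONE the second formula comes out as 0.625, the exact negative of the first:

```python
from collections import defaultdict

MAX_HOURS = 8
# (resource, level, desired, available) rows copied from the table
rows = [
    (1, "ONE", 8, 0), (2, "TWO", 5, 8), (3, "THREE", 6, 4),
    (4, "ONE", 4, 2), (5, "TWO", 6, 2), (6, "TWO", 7, 7),
    (7, "THREE", 0, 8),
]

by_level = defaultdict(list)
for _, level, desired, available in rows:
    by_level[level].append((desired, available))

hour_diff = {}  # first formula: average (available - desired), scaled by the day
pct_diff = {}   # second formula: average (desired/8 - available/8)
for level, pairs in by_level.items():
    n = len(pairs)
    hour_diff[level] = sum(a - d for d, a in pairs) / n / MAX_HOURS
    pct_diff[level] = sum((d - a) / MAX_HOURS for d, a in pairs) / n

print(hour_diff["ONE"], pct_diff["ONE"])  # -0.625 0.625
```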
And what happens when someone is available for the full 8 hours, as with RESOURCEs 2 and 7: should that count as 100% in any case, even when fewer hours are desired?
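One convention I could imagine for that case, sketched below, is to measure availability as a fraction of the *desired* hours and cap it at 100%; treating a desired load of 0 as fully covered is purely my own assumption:

```python
# Sketch of one possible convention (my assumption, not a settled rule):
# availability as a fraction of desired hours, capped at 100% when someone
# is available for longer than needed.
def availability_pct(desired, available):
    if desired == 0:
        return 1.0  # assumption: nothing desired counts as fully covered
    return min(available / desired, 1.0)

print(availability_pct(5, 8))  # RESOURCE 2: 8 available vs 5 desired -> 1.0
print(availability_pct(0, 8))  # RESOURCE 7: nothing desired -> 1.0
print(availability_pct(8, 0))  # RESOURCE 1: nothing available -> 0.0
```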