Is It Possible to Estimate Process Time from Time to Resolution?


I'm looking at an operational workflow (nominally first-in, first-out, but there are exceptions). Time spent actively working isn't logged per work item, but I do have metrics on how long each piece of work stayed in the queue. Team size can fluctuate from day to day.

I'm working with two quantities: 'resolution time' is the total time a piece of work sits in the department, including time when it is not being worked on. 'Process time' is the time it actually took to perform the work, and is the metric I'd like to estimate.

If work arrived in a steady stream, so that nothing ever waited in the queue, process time could be estimated by simply averaging resolution time:

  • Item 1 - 1 hour
  • Item 2 - 1 hour

Average: 1 hour
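Under that steady-arrival assumption, the naive estimate is just the mean of the recorded resolution times. A minimal sketch (the numbers are the made-up ones from the example above):

```python
# Naive estimate: with steady arrivals and no queueing,
# resolution time equals process time, so averaging works.
resolution_hours = [1.0, 1.0]  # Item 1, Item 2 from the example above

naive_process_estimate = sum(resolution_hours) / len(resolution_hours)
print(naive_process_estimate)  # 1.0
```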

What I discovered is that front-loaded work makes this calculation unreliable.

For example, if every piece of work took an hour to do, but I received 5 pieces of work at the same time, the resolution times recorded by the system would be:

  • Item 1 - 1 hour
  • Item 2 - 2 hours (1 hour sitting in queue + 1 hour process time)

And so on, up to 5 hours for Item 5. Average (for 5 items): 3 hours
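The bias is easy to reproduce. Here's a small sketch that simulates the front-loaded case, assuming (for illustration only) a single worker serving strictly FIFO:

```python
# Simulate the front-loaded case: 5 items arrive at the same time,
# one worker, each item takes exactly 1 hour of process time.
# (Single-worker FIFO is an assumption made just for this sketch.)
process_time = 1.0
n_items = 5

resolution_times = []
clock = 0.0  # all items arrive at time 0
for _ in range(n_items):
    clock += process_time           # worker finishes the next item
    resolution_times.append(clock)  # resolution = queue wait + process time

print(resolution_times)                               # [1.0, 2.0, 3.0, 4.0, 5.0]
print(sum(resolution_times) / len(resolution_times))  # 3.0
```

The average resolution time (3 hours) is triple the true process time (1 hour), even though every item took the same amount of actual work.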

I'd like to estimate with some confidence the process time for each work item. Previously, I collected manual work logs, but that data is now out of date and the collection method is slow and burdensome.
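For what it's worth, if per-item arrival timestamps were available alongside resolution times, then in the simplest case, a single worker serving strictly FIFO, process time could be backed out exactly, since each item starts when the worker frees up or when it arrives, whichever is later. A hypothetical sketch (function and variable names are mine; my real setting has a fluctuating team size, so this only covers the single-worker case):

```python
# Back out per-item process time from arrival timestamps plus
# resolution times, assuming ONE worker serving strictly FIFO.
# With multiple workers this no longer holds exactly.
def process_times(arrivals, resolutions):
    """arrivals[i]: arrival time of item i; resolutions[i]: its time in the department."""
    times = []
    prev_departure = float("-inf")
    for arrive, res in zip(arrivals, resolutions):
        depart = arrive + res
        start = max(arrive, prev_departure)  # work starts when the worker is free
        times.append(depart - start)
        prev_departure = depart
    return times

# The front-loaded example: 5 items arrive at t=0, resolutions 1..5 hours.
print(process_times([0, 0, 0, 0, 0], [1, 2, 3, 4, 5]))  # [1, 1, 1, 1, 1]
```

This recovers the true 1-hour process time for every item in the front-loaded example, but I don't see how to extend it when the number of workers changes by day.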

Is it possible to compute process time from time to resolution?