To analyse the processes in an organizational department, my idea is to model the processes as a queueing system. Suppose the service times are exponentially distributed.
We can then consider every process or task as a server. A complication arises when more than one person spends part of his or her time on a task: in that case every server is effectively a collection of machines with different service times.
Suppose that during a certain time period, person A spends 20% of his or her time on "process 1" and person B spends 30% of his or her time on "process 1".
Is it now mathematically correct to state that, since the service times of persons A and B are independent, the average service time of the process is the sum of each person's average service time weighted by the fraction of time he or she spends on that process?
In mathematical terms:
$$ \mu_1 = r_{A,1}\,\mu_{A,1} + r_{B,1}\,\mu_{B,1} $$
where $r_{p,1}$ is the fraction of time that person $p$ spends on process 1 and $\mu_{p,1}$ is the average service time of person $p$ for process 1.
The answer seems to be yes, since the expected values of independent exponential service times can simply be added. However, this seems like an unusual situation. Can somebody confirm this?
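As a numerical sanity check of the weighted sum I tried a small simulation (the mean service times 5 and 8 below are hypothetical, since the question gives only the time fractions; only the 20% and 30% come from the setup). It only checks the mean, not whether the combined distribution is still exponential, which is a separate matter:

```python
import random

random.seed(42)

# Hypothetical mean service times for persons A and B on process 1.
mu_A, mu_B = 5.0, 8.0
# Time fractions from the question: 20% and 30%.
r_A, r_B = 0.2, 0.3

n = 200_000
# Draw independent exponential service times for A and B.
# (random.expovariate takes the *rate*, i.e. 1/mean.)
samples = [r_A * random.expovariate(1 / mu_A) + r_B * random.expovariate(1 / mu_B)
           for _ in range(n)]

empirical = sum(samples) / n
formula = r_A * mu_A + r_B * mu_B  # the weighted sum asked about: 0.2*5 + 0.3*8 = 3.4

print(f"simulated mean: {empirical:.3f}")
print(f"formula value:  {formula:.3f}")
```

The two values agree closely, but as far as I can tell that follows from linearity of expectation (which does not even require independence), so I am not sure it justifies the queueing interpretation.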