Is it possible, based on a given turnover rate per year (in %), to calculate the average time (in years) an employee stays in an organization?
I have looked at the Coupon Collector's problem:
https://en.wikipedia.org/wiki/Coupon_collector's_problem
but so far haven't been able to figure this out.
Turnover could basically be calculated as the number of employees who leave per year divided by the average number of employees.
This strikes me as an application of Little's Law,
$$L = \lambda W,$$
where $L$ is the long-term average number of customers in a queueing system, $\lambda$ is the arrival rate, and $W$ is the average time a customer spends in the system.

Assuming the company's size is stable, the turnover rate is the arrival rate of new employees divided by the number of employees:
$$\tau = \lambda / L.$$
By Little's Law, then,
$$W = L / \lambda = 1 / \tau.$$
Put in words: the average time an employee stays with the company is the reciprocal of the turnover rate. For example, a 20% annual turnover rate implies an average tenure of $1/0.2 = 5$ years.
I don't see any connection with the Coupon Collector's Problem.
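The $W = 1/\tau$ relationship can be sanity-checked with a quick simulation. This is a minimal sketch with hypothetical numbers (a 1,000-person company, 20% annual turnover, discrete yearly steps): each year every employee leaves with probability equal to the turnover rate and is immediately replaced, so headcount stays stable, and we compare the observed average completed tenure to $1/\tau$.

```python
import random

random.seed(42)

# Hypothetical parameters, not from the question:
HEADCOUNT = 1000   # stable company size
TURNOVER = 0.20    # annual turnover rate (tau)
YEARS = 2000       # simulated years

tenures = []                    # completed tenures of employees who left
current = [0] * HEADCOUNT       # years of service of each current employee

for _ in range(YEARS):
    for i in range(HEADCOUNT):
        current[i] += 1
        # Each year, each employee independently leaves with probability tau
        if random.random() < TURNOVER:
            tenures.append(current[i])  # record the completed tenure
            current[i] = 0              # seat is refilled by a new hire

avg_tenure = sum(tenures) / len(tenures)
print(f"observed average tenure: {avg_tenure:.2f} years")
print(f"1 / turnover rate:       {1 / TURNOVER:.2f} years")
```

In this discrete-time model each tenure is geometrically distributed with success probability $\tau$, whose mean is exactly $1/\tau$, so the observed average converges to 5 years here, matching the Little's Law result.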