I am considering a queueing model of the form $M/M/\infty$; the properties of this queue can be found here: http://en.wikipedia.org/wiki/M/M/%E2%88%9E_queue
I am interested in the average busy period of this model, i.e. the average length of a time interval during which at least one server is busy. I am a little confused about how this should look with an infinite number of servers.
Wikipedia gives the formula $$\frac{1}{\lambda}\sum_{i\gt c} \frac{c!}{i!}\left( \frac{\lambda}{\mu} \right)^{i-c}$$ for the expected length of time the process spends above a fixed level $c$, where timing starts from the instant the process transitions to state $c+1$.
You are asking for the case $c=0$, which gives $$\frac{1}{\lambda}\sum_{i \gt 0} \frac{1}{i!}\left( \frac{\lambda}{\mu} \right)^{i} = \frac{1}{\lambda}(e^{\lambda/ \mu}-1)$$
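As a sanity check (a sketch, not part of the derivation above), you can simulate the birth-death chain directly: with $n$ customers present, the next event occurs after an $\mathrm{Exp}(\lambda + n\mu)$ holding time and is an arrival with probability $\lambda/(\lambda + n\mu)$. The rates $\lambda = 2$, $\mu = 3$ below are arbitrary example values.

```python
import math
import random

def time_above(c, lam, mu, rng):
    """Time the M/M/inf queue spends above level c, timed from the
    instant the process enters state c+1 until it returns to c."""
    n, t = c + 1, 0.0
    while n > c:
        rate = lam + n * mu            # arrivals at rate lam, departures at rate n*mu
        t += rng.expovariate(rate)     # exponential holding time in state n
        if rng.random() < lam / rate:
            n += 1                     # arrival
        else:
            n -= 1                     # departure
    return t

lam, mu = 2.0, 3.0                     # arbitrary example rates
rng = random.Random(0)
runs = 200_000
est = sum(time_above(0, lam, mu, rng) for _ in range(runs)) / runs
exact = (math.exp(lam / mu) - 1) / lam # the closed form above, for c = 0
print(f"simulated {est:.4f}  vs  exact {exact:.4f}")
```

The two numbers should agree to within Monte Carlo error; calling `time_above` with $c > 0$ lets you check the general formula the same way.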