This question appears without an answer in an old exam I found (it is not a homework question).
Suppose messages that enter a system need to be processed by two servers. They arrive at the system at a rate of $a$ and queue up to be processed by Server 1. Server 1 processes the messages at a rate of $u_1$. Then they proceed to Server 2, where they queue up until they are served at a rate of $u_2$. The queues can be infinite. What is the probability that there will be $k$ messages in Server 1's queue?
I'm confused about how exactly to go about this. There are infinitely many ways for the queue to reach a given number of messages, so somehow I have to sum over all these cases, each weighted by its probability, and I am also unsure how to compute those probabilities.
I drew a Markov diagram to help me a little:

One can forget Server 2. The length of the queue at Server 1 increases by $1$ at rate $a$ and, when it is positive, decreases by $1$ at rate $u_1$. This is the most classical birth-and-death chain; its stationary distribution exists if and only if $a\lt u_1$ and can be readily found, either by reading your textbook or by solving the associated system of balance equations.
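For what it's worth, solving the balance equations $a\,\pi_k = u_1\,\pi_{k+1}$ and normalizing yields the geometric form $\pi_k = (1-\rho)\,\rho^k$ with $\rho = a/u_1$, valid when $a \lt u_1$. A quick numerical sketch (not part of the original answer, with illustrative rates $a = 1.0$, $u_1 = 2.0$ of my choosing) recovers this by truncating the chain, uniformizing it, and power-iterating to the stationary distribution:

```python
# Minimal sketch: stationary distribution of the birth-death chain for
# Server 1's queue, computed numerically on a truncated state space and
# compared with the closed form pi_k = (1 - rho) * rho**k, rho = a/u1.
# The rates a = 1.0 and u1 = 2.0 are illustrative, not from the exam.

def stationary_numeric(a, u1, N=50, iters=5000):
    """Stationary distribution of the chain truncated at N states.

    Uniformization: P = I + Q/L with L = a + u1 bounding every total
    exit rate, then repeated multiplication pi <- pi * P.
    """
    L = a + u1
    pi = [1.0 / (N + 1)] * (N + 1)
    for _ in range(iters):
        new = [0.0] * (N + 1)
        for k in range(N + 1):
            up = a / L if k < N else 0.0     # arrival: k -> k + 1
            down = u1 / L if k > 0 else 0.0  # service: k -> k - 1
            new[k] += pi[k] * (1.0 - up - down)
            if k < N:
                new[k + 1] += pi[k] * up
            if k > 0:
                new[k - 1] += pi[k] * down
        pi = new
    return pi

if __name__ == "__main__":
    a, u1 = 1.0, 2.0
    rho = a / u1
    pi = stationary_numeric(a, u1)
    for k in range(5):
        print(k, round(pi[k], 6), round((1 - rho) * rho**k, 6))
```

The truncation at $N = 50$ introduces only a negligible error here, since the tail mass beyond $N$ is of order $\rho^{N}$.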