The text I am reading says that the infinitesimal generator of the waiting time process in an M/G/1 queue is given by $$Gf(y)=\lambda \int_0^\infty [f(y+s)-f(y)]\,dF(s)-f'(y)1(y>0),$$ where $F$ is the distribution of the service time. Can anyone please explain how this is derived? My intuition is that the first term is due to the arrival process and the second term corresponds to the departures, but I don't know how to derive it. I am particularly interested in whether this can be extended to the G/G/1 queue case. Any reference which discusses infinitesimal generators in the context of queues would be appreciated.
Generator of waiting time process in a queue
Asked 2026-03-27 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 283 views · 1 answer
As I recall, the arrival process for the M/G/1 queue is a Poisson process of rate $\lambda$, and the service time is a general random variable $S$.
After staring a great deal at the formula for the infinitesimal generator, I must admit that my strategy in solving this question went backwards: I used the generator formula to guess what $W_t$ might look like, and then checked that the resulting model was consistent with known ones.
Anyway, the infinitesimal generator of a Markov process $X_t$ is an operator $A$ (whose domain consists of "nice enough" functions $f$) such that $$ Af(x) = \lim_{t \to 0} \frac{\mathbb E_xf(X_t) - f(x)}{t}, $$ where $\mathbb E_x$ denotes expectation with respect to the law of the paths of $X_t$ conditioned on $X_0=x$. Roughly speaking, the idea is the following: even if $X_t$ is not differentiable with respect to $t$ (we can't even define what this means if $X_t$ takes values in some weird space), we can always consider the process $f(X_t)$ for a real-valued function $f$ defined on the state space of $X$, and this takes values in the real numbers. We intend to capture the instantaneous rate of change of $X$ by capturing the rates of change of these functions of $X$ instead.
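To make this definition concrete, here is a minimal numerical sketch of my own (not from the text): for a rate-$\lambda$ Poisson process $N_t$ started at $x$, the generator is known to be $Af(x)=\lambda\,(f(x+1)-f(x))$, and the finite-difference quotient from the definition should approach it for small $t$. All parameter choices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: Poisson process of rate lam, started at x,
# observed after a small time t, over n Monte Carlo samples.
lam, x, t, n = 2.0, 3, 0.005, 2_000_000
f = lambda k: k ** 2  # an arbitrary test function

x_t = x + rng.poisson(lam * t, size=n)   # X_t given X_0 = x
estimate = (f(x_t).mean() - f(x)) / t    # (E_x f(X_t) - f(x)) / t
exact = lam * (f(x + 1) - f(x))          # known generator of a Poisson process
print(estimate, exact)
```

Here $\lambda(f(x+1)-f(x)) = 2\,(16-9) = 14$; the quotient differs from it by an $O(t)$ bias plus Monte Carlo noise.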
Our task is to find the infinitesimal generator of the waiting time process $W_t$ of an $M/G/1$ queue. Remember that the waiting time process takes values only in the set of non-negative real numbers.
Let us first make sure that we have the distribution of $W_t$ right. Suppose that $W_0=y \geq 0$. Let $t>0$ be arbitrary. $W_t$ is the waiting time for the queue at time point $t$. We must compare this to the waiting time at time $0$, namely $y$.
To do this, observe that by time $t$, exactly $t$ units of time have elapsed. At this point, we must distinguish two cases: either $y\geq t$ or $y<t$. If $y\geq t$, then the work present at time $0$ has still not been fully served at time $t$; if $y<t$, then the work present at time $0$ was completed at time $y$, before time $t$.
Now, between the time points $0$ and $t$, some customers will have arrived, increasing the waiting time. How many such customers? Well, it depends on the arrival process, which in this case is Poisson, but we'll denote the number of arrivals by a general counting process $G(t)$. This ensures that we are actually studying the $\mathbf{G/G/1}$ situation for as long as possible, answering your bountied request to the best of my ability.
The number of arrivals at time $t$ is some random variable $G(t)$. Each such arrival requires $S_i$ time to be served, where $S_i$ has the same distribution as $S$; furthermore, each customer's associated $S_i$ is independent of the others. All in all, one sees that for a $\mathit{G/G/1}$ queue,$$ W_t = \max\{y-t,0\} + \sum_{i=1}^{G(t)} S_i. $$ The second part reflects the service times of arrivals in $[0,t]$ (it's $0$ if $G(t)=0$). The first part says that if $y\geq t$, then at time $t$ there is still $y-t$ of the original work left; otherwise $y<t$ implies that at time $t$ the work present at time $0$ no longer counts in the waiting time calculations: it has already been served. (Strictly speaking, this representation ignores the possibility that the server idles and then receives new work within $(0,t)$; for small $t$ that event contributes only $o(t)$ and does not affect the generator.)
That's what the formula looks like for a $G/G/1$ queue. Now, let's see how we can find the infinitesimal generator.
The idea is the following : clearly, for any function $f$, $$ f(W_t) - f(y) = f\left(\max\{y-t,0\} + \sum_{i=1}^{G(t)}S_i\right)-f(y) $$ by merely applying $f$ to both sides of the definition of $W_t$ and then subtracting $f(y)$ from both sides. Applying the expectation (conditioned on $W_0=y$) on both sides of this equation,$$ \mathbb E_y (f(W_t)-f(y)) = \mathbb E_y\left[f\left(\max\{y-t,0\} + \sum_{i=1}^{G(t)}S_i\right)-f(y)\right]. $$
We must obviously work with the right hand side. We do what is expected to be done in this case : condition on $G(t)$, which is a random variable that represents the number of arrivals between time points $0$ and $t$, hence is a non-negative integer. Basically, by the law of iterated expectation, $$ \mathbb E_y\left[f\left(\max\{y-t,0\} + \sum_{i=1}^{G(t)}S_i\right)-f(y)\right] \\ = \sum_{n \in \mathbb N} \mathbb P(G(t)=n) \mathbb E_y \left[f\left(\max\{y-t,0\} + \sum_{i=1}^{n}S_i\right)-f(y)\right]. $$ At this point, recalling the definition of the infinitesimal generator,$$ Af(y) = \lim_{t \to 0} \frac{\mathbb E_y f(W_t) - f(y)}{t} = \lim_{t \to 0} \frac 1t \mathbb E_y(f(W_t) - f(y))\\ = \lim_{t \to 0} \sum_{n \in \mathbb N} \frac{P(G(t)=n)}{t} \mathbb E_y \left[f\left(\max\{y-t,0\} + \sum_{i=1}^{n}S_i\right)-f(y)\right]. $$
This is a general formula for the infinitesimal generator of the waiting time process of the G/G/1 queue. We still haven't used the fact that the arrival process is Poisson : but we will do that shortly.
The first thing we do is assume that we can switch the limit and the sum. This can be justified when $G$ is Poisson, since the tail terms are controlled (as we will see below); for a general arrival process, I am not sure why it is always permitted. So let us now specialise to $G=N$ being Poisson and proceed with the switch.
Switching the two of them, $$ \lim_{t \to 0} \sum_{n \in \mathbb N} \frac{P(N(t)=n)}{t}\mathbb E_y \left[ f\left(\max\{y-t,0\} + \sum_{i=1}^{n}S_i\right)-f(y)\right] \\ = \sum_{n \in \mathbb N} \lim_{t \to 0}\left(\frac{P(N(t)=n)}{t}\right)\mathbb E_y \left[ f\left(\max\{y-t,0\} + \sum_{i=1}^{n}S_i\right)-f(y)\right]. $$
Observe that $\mathbb P(N(t) = n) = \frac{e^{-\lambda t}(\lambda t)^n}{n!}$. In particular, $\lim_{t\to 0}\frac{\mathbb P(N(t) = 0)}{t} = +\infty$, $\lim_{t\to 0}\frac{\mathbb P(N(t) = 1)}{t} = \lambda$ and $\lim_{t\to 0}\frac{\mathbb P(N(t) = k)}{t} = 0$ for all $k>1$. (In the case of a general process $G$, you will have to study the limits $\lim_{t \to 0}\frac{\mathbb P(G(t)=n)}{t}$ and see how quickly these go to zero. Then you will understand which of the limits remain).
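These three limits can be checked symbolically; here is a quick sketch of my own with sympy (not part of the original answer):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
# P(N(t) = n) for a Poisson process of rate lam
p = lambda n: sp.exp(-lam * t) * (lam * t) ** n / sp.factorial(n)

print(sp.limit(p(0) / t, t, 0))  # oo
print(sp.limit(p(1) / t, t, 0))  # lambda
print(sp.limit(p(2) / t, t, 0))  # 0
```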
Using the above observation, we break the sum over $n \in \mathbb N$ into three pieces: one for $n=0$, one for $n=1$, and one for the rest. In the first piece, we transfer the $\frac 1t$ from the probability factor to the term containing the expectation: you'll see why. In the other pieces we retain it. $$ \sum_{n \in \mathbb N} \lim_{t \to 0}\left(\frac{P(N(t)=n)}{t}\right)\mathbb E_y \left[ f\left(\max\{y-t,0\} + \sum_{i=1}^{n}S_i\right)-f(y)\right]\\= \lim_{t \to 0}\left(P(N(t)=0)\right)\mathbb E_y \left[\frac{f\left(\max\{y-t,0\}\right)-f(y)}{t}\right] \\ + \lim_{t \to 0}\left(\frac{P(N(t)=1)}{t}\right)\mathbb E_y \left[f\left(\max\{y-t,0\} + S\right)-f(y)\right] \\ + \sum_{n \geq 2} \lim_{t \to 0}\left(\frac{P(N(t)=n)}{t}\right)\mathbb E_y \left[ f\left(\max\{y-t,0\} + \sum_{i=1}^{n}S_i\right)-f(y)\right]. $$
Now, the last term goes to $0$, since each of its summands does. In the first term, $P(N(t)=0)=e^{-\lambda t}$ goes to $1$; moreover, the quantity inside the expectation is deterministic, so $$\lim_{t \to 0}\mathbb E_y \left[\frac{f\left(\max\{y-t,0\}\right)-f(y)}{t}\right] = \lim_{t \to 0} \frac{f\left(\max\{y-t,0\}\right)-f(y)}{t}.$$ If $y=0$, this quantity is $0$ for every $t>0$; otherwise, for $t<y$ it equals $\frac{f(y-t)-f(y)}{t}$, which converges to $-f'(y)$. Therefore, the limit of the first term is $-f'(y) 1_{y>0}$.
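For a concrete smooth $f$ the limit in the first term can also be verified symbolically; a small sketch of my own with sympy, using $f(y)=e^{-y}$ as the test function:

```python
import sympy as sp

t, y = sp.symbols('t y', positive=True)
f = sp.exp(-y)                  # concrete test function f(y) = exp(-y)

# For 0 < t < y we have max(y - t, 0) = y - t, so the quotient becomes:
quotient = (f.subs(y, y - t) - f) / t
lim = sp.limit(quotient, t, 0)
print(sp.simplify(lim + sp.diff(f, y)))  # 0, i.e. the limit equals -f'(y)
```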
How about the second term? Here, $\lim_{t \to 0}\frac{P(N(t)=1)}{t} = \lambda$. On the other hand, $$ \mathbb E_y \left[ f\left(\max\{y-t,0\} + S\right)-f(y)\right] = \int_{0}^{\infty} \left(f\left(\max\{y-t,0\} + s\right) - f(y)\right)dF(s). $$ As $t \to 0$, by the dominated convergence theorem (noting that $f$ is bounded and continuous), this converges to $$\int_{0}^{\infty} \left(f\left(\max\{y,0\} + s\right) - f(y)\right)dF(s)=\int_{0}^{\infty} \left(f\left(y + s\right) - f(y)\right)dF(s).$$ Thus, the second term converges to $\lambda\int_{0}^{\infty} \left(f\left(y + s\right) - f(y)\right)dF(s)$.
All in all, $$ \bbox[border:2px solid red] {Af(y) = \lambda\int_{0}^{\infty} (f\left(y + s\right) - f(y))dF(s) - f'(y)1_{y>0}}, $$ as desired.
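As a sanity check (my own numerical sketch, not from the text), one can estimate the difference quotient by Monte Carlo in the M/M/1 special case, where the integral has a closed form: with $f(y)=e^{-y}$ and $S\sim\mathrm{Exp}(\mu)$, $\int_0^\infty (f(y+s)-f(y))\,dF(s) = e^{-y}\left(\frac{\mu}{\mu+1}-1\right)$. All parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

lam, mu = 1.0, 1.0             # arrival rate, Exp(mu) services (illustrative)
y, t, n = 1.0, 0.01, 1_000_000
f = lambda w: np.exp(-w)

# For y > t the server never idles on [0, t], so exactly
# W_t = y - t + (sum of service times of the Poisson(lam*t) arrivals).
k = rng.poisson(lam * t, size=n)                        # arrivals per sample
# A sum of k iid Exp(mu) variables is Gamma(k, 1/mu); guard the k = 0 case.
jumps = np.where(k > 0, rng.gamma(np.maximum(k, 1), 1.0 / mu), 0.0)
w_t = y - t + jumps

estimate = (f(w_t).mean() - f(y)) / t                   # (E_y f(W_t) - f(y)) / t
exact = lam * np.exp(-y) * (mu / (mu + 1) - 1) + np.exp(-y)  # A f(y) for y > 0
print(estimate, exact)
```

With these choices $Af(1) = e^{-1}/2 \approx 0.184$, and the Monte Carlo quotient lands close to it.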
Recall that I mentioned earlier that the only quantities that matter are the limits $\lim_{t \to 0} \frac{\mathbb P(G(t)=n)}{t}$. For $n=0$, you can transfer the $\frac 1t$ to the derivative part like we did with $G=N$. For $n\geq 1$, though, (to me at least) it is not obvious that these limits will all follow some pattern, so once you find those limits, you get the infinitesimal generator for the waiting time of the G/G/1 process, because for $n \geq 1$, as $t \to 0$, $$ \mathbb E_y \left[ f\left(\max\{y-t,0\} + \sum_{i=1}^n S_i\right)-f(y)\right] \to \int_{0}^{\infty} \left(f\left(y + s\right) - f(y)\right)dF^{*n}(s), $$ where $F^{*n}$, the $n$-fold convolution of $F$, is the distribution function of the random variable $\sum_{i=1}^n S_i$. Hence, the limits of these expectations always exist, and you must merely investigate $\lim_{t \to 0} \frac{\mathbb P(G(t)=n)}{t}$ for each $n \geq 1$ to find the infinitesimal generator of the $G/G/1$ queue with arrival process $G$ and service time $S$.
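To illustrate the convolution limit above, here is a quick Monte Carlo sketch of my own, with illustrative $\mathrm{Exp}(\mu)$ services: then $F^{*n}$ is the Erlang/Gamma$(n, 1/\mu)$ distribution, and for $f(y)=e^{-y}$ the integral has the closed form $e^{-y}\big(\big(\frac{\mu}{\mu+1}\big)^n - 1\big)$.

```python
import numpy as np

rng = np.random.default_rng(0)

mu, y, n_fold = 1.0, 1.0, 3          # illustrative choices
f = lambda w: np.exp(-w)

# Draw from F^{*n}: the sum of n_fold iid Exp(mu) service times.
s = rng.exponential(1.0 / mu, size=(1_000_000, n_fold)).sum(axis=1)
mc = f(y + s).mean() - f(y)          # int (f(y+s) - f(y)) dF^{*n}(s), by MC
exact = np.exp(-y) * ((mu / (mu + 1)) ** n_fold - 1)
print(mc, exact)
```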