Exercise about the Kolmogorov forward equation and the probability-generating function


I am studying infinitesimal generators, and in the course of my reading I found the following exercise:

Let $X_t$ be a Markov process with state space $\{0,\dots,N\}$.

We are now interested in the moments of $X_t$ for $t\geq 0$.

We write $(P_t)_{t\in\mathbb{R}_+}$ for the semigroup of $X$, with $P_t = (P_t(x,y))_{x,y\in\{0,\dots,N\}}$, and set $\mu_x^{(k)}(t) := E_x [X_t^k] := \sum_{y=0}^{N}P_t(x,y)\,y^k,\quad k\geq1,\;t\in\mathbb{R}_{+},\;0\leq x\leq N.$

Convention: we set $P_t(x,-1) = P_t(x,N+1)=0$ for $0\leq x\leq N$.

We have, for $x,y\in\{1,\dots,N\}$,

$\frac{dP_t(x,y)}{dt}=P_t(x,y-1)\left(\beta(y-1)-\frac{\beta}{N}(y-1)^2\right)-P_t(x,y)\left((\beta+1)y-\frac{\beta}{N}y^2\right)+P_t(x,y+1)(y+1)$

Note that $\beta(y-1)-\frac{\beta}{N}(y-1)^2=0$ for $y=N+1$, consistent with the convention above.
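As a sanity check, the right-hand side of this forward equation is exactly $(P_tQ)(x,y)$, where $Q$ is the generator of a logistic birth-death chain with birth rate $\beta y-\frac{\beta}{N}y^2$ and death rate $y$ from state $y$. (That underlying chain is my reading of the equation, not something stated in the exercise.) A minimal numerical sketch, using an arbitrary stochastic matrix in place of $P_t$ since the identity is purely algebraic:

```python
import numpy as np

rng = np.random.default_rng(0)
N, beta = 10, 2.0

# Generator Q of the (assumed) logistic birth-death chain:
# birth rate beta*y - (beta/N)*y^2, death rate y, from state y.
Q = np.zeros((N + 1, N + 1))
for y in range(N + 1):
    b = beta * y - (beta / N) * y ** 2
    d = y
    if y < N:
        Q[y, y + 1] = b
    if y > 0:
        Q[y, y - 1] = d
    Q[y, y] = -(b + d)

# An arbitrary stochastic matrix standing in for P_t.
P = rng.random((N + 1, N + 1))
P /= P.sum(axis=1, keepdims=True)

# Right-hand side of the stated forward equation, entry by entry,
# with the convention P(x, -1) = P(x, N+1) = 0.
def rhs(P, x, y):
    left = P[x, y - 1] if y >= 1 else 0.0
    right = P[x, y + 1] if y <= N - 1 else 0.0
    return (left * (beta * (y - 1) - (beta / N) * (y - 1) ** 2)
            - P[x, y] * ((beta + 1) * y - (beta / N) * y ** 2)
            + right * (y + 1))

stated = np.array([[rhs(P, x, y) for y in range(N + 1)]
                   for x in range(N + 1)])
print(np.max(np.abs(stated - P @ Q)))  # agrees up to rounding error
```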

Prove that

$\mu_x^{(1)}(0)=x,\quad \dfrac{d\mu_x^{(1)}(t)}{dt}=(\beta -1)\mu_x^{(1)}(t)-\frac{\beta}{N}\mu_x^{(2)}(t).$
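A sketch of the computation, using only the forward equation and the boundary convention: multiply the forward equation by $y$, sum over $y$, and shift the summation index in the first and last terms, so that

$$\frac{d\mu_x^{(1)}(t)}{dt}=\sum_{y=0}^{N} y\,\frac{dP_t(x,y)}{dt}=\sum_{y=0}^{N}P_t(x,y)\left[(y+1)\left(\beta y-\frac{\beta}{N}y^2\right)-y\left((\beta+1)y-\frac{\beta}{N}y^2\right)+(y-1)\,y\right].$$

The bracket collapses to $(\beta-1)y-\frac{\beta}{N}y^2$, and summing against $P_t(x,y)$ gives $(\beta -1)\mu_x^{(1)}(t)-\frac{\beta}{N}\mu_x^{(2)}(t)$; the initial condition $\mu_x^{(1)}(0)=x$ follows from $P_0(x,y)=\mathbf{1}_{\{x=y\}}$.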