$Q$-matrices and Markov Chain properties


I would like to start with the definition of a $Q$-matrix on a countable set $I$. A $Q$-matrix is a matrix $Q=(q_{ij}:i,j\in I)$ satisfying the following conditions:

(i) $0\le -q_{ii}<\infty$ for all $i$

(ii) $q_{ij}\ge 0$ for all $i\not= j$

(iii) $\sum_{j\in I}q_{ij}=0$ for all $i$
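On a finite index set, these three conditions are easy to check numerically. The following is a minimal sketch (the matrix `Q` and the helper `is_q_matrix` are hypothetical examples, not from the original post):

```python
import numpy as np

def is_q_matrix(Q, tol=1e-12):
    """Check conditions (i)-(iii) for a Q-matrix on a finite index set."""
    Q = np.asarray(Q, dtype=float)
    diag = np.diag(Q)
    off_diag = Q - np.diag(diag)
    return (
        np.all(-diag >= 0)                            # (i)   0 <= -q_ii < infinity
        and np.all(off_diag >= -tol)                  # (ii)  q_ij >= 0 for i != j
        and np.allclose(Q.sum(axis=1), 0, atol=tol)   # (iii) row sums are zero
    )

# A hypothetical 3x3 example satisfying all three conditions:
Q = np.array([[-3.0,  2.0,  1.0],
              [ 4.0, -6.0,  2.0],
              [ 0.5,  0.5, -1.0]])
print(is_q_matrix(Q))  # True
```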

Now I come to my two problems: Let $f:I\rightarrow\mathbb R$ be a function, identified with the vector $(f(x))_{x\in I}$, and let $Q$ be a $Q$-matrix. Then it follows that $$Qf(x)=\sum_{y}q_{xy}(f(y)-f(x)).$$

Where does this come from?

Let $\phi(t)=\mathbb E_x(f(X_t))$, where $(X_t)$ is a Markov chain with $Q$-matrix given by $Q$. Then we have $$Qf(x)=\phi'(0)=\lim_{t\rightarrow 0}\frac{\mathbb E_x(f(X_t))-f(x)}{t}.$$

The second equality seems trivial to me, since it is just the definition of the derivative, but the first one, $Qf(x)=\phi'(0)$, is not clear to me. Maybe you can help me with that.

Best answer:

By definition, $Qf(x)=\sum_{y}q_{xy} f(y)$. But the row sums of $Q$ are zero, so $\sum_{y} q_{xy}f(x)=f(x)\sum_{y} q_{xy}=0$; subtracting this from the definition gives $Qf(x)=\sum_{y}q_{xy} (f(y)-f(x)).$
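The identity above can be checked numerically on a small example. This is a sketch assuming a finite state space; the $3\times 3$ matrix `Q` and vector `f` below are hypothetical:

```python
import numpy as np

# A hypothetical Q-matrix (off-diagonals nonnegative, rows sum to zero)
Q = np.array([[-3.0,  2.0,  1.0],
              [ 4.0, -6.0,  2.0],
              [ 0.5,  0.5, -1.0]])
f = np.array([1.0, -2.0, 5.0])  # a function f: I -> R as a vector

# Definition: (Qf)(x) = sum_y q_{xy} f(y)
Qf_def = Q @ f

# Equivalent form: (Qf)(x) = sum_y q_{xy} (f(y) - f(x)),
# valid because sum_y q_{xy} f(x) = f(x) * (row sum of Q) = 0
Qf_diff = np.array([sum(Q[x, y] * (f[y] - f[x]) for y in range(3))
                    for x in range(3)])

print(np.allclose(Qf_def, Qf_diff))  # True: the two forms agree
```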

We can also re-write $$ \frac{\mathbb E_x(f(X_t))-f(x)}{t} =\sum_{y\neq x} (f(y)-f(x))\, {1\over t}\mathbb{P}_x(X_t=y).$$ By definition of the $Q$-matrix, we have for $x\neq y$ $$q_{xy}=\lim_{t\downarrow 0} {1\over t}\mathbb{P}_x(X_t=y),$$ which gives the result.
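The limit can also be seen numerically: for a finite state space, $\mathbb P_x(X_t=y)$ is the $(x,y)$ entry of $P(t)=e^{tQ}$, so the difference quotient $(\mathbb E_x(f(X_t))-f(x))/t$ should approach $Qf(x)$ as $t\downarrow 0$. A minimal sketch, using a truncated Taylor series for the matrix exponential and a hypothetical `Q` and `f`:

```python
import numpy as np

def expm_taylor(A, terms=60):
    # Matrix exponential exp(A) via truncated Taylor series
    # (accurate here because ||A|| is small).
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

Q = np.array([[-3.0,  2.0,  1.0],
              [ 4.0, -6.0,  2.0],
              [ 0.5,  0.5, -1.0]])
f = np.array([1.0, -2.0, 5.0])

t = 1e-5
P_t = expm_taylor(t * Q)        # P(t) = exp(tQ); P_x(X_t = y) = P_t[x, y]
diff_quot = (P_t @ f - f) / t   # (E_x f(X_t) - f(x)) / t, for each state x

# The difference quotient is close to Qf for small t:
print(np.max(np.abs(diff_quot - Q @ f)))
```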


Added: The equation in your comment is not quite correct. What you want follows from $$\mathbb{E}_x(Qf(X_s))=\phi^\prime(s),$$ and $$\phi(t)=\phi(0)+\int_0^t \phi^\prime(s)\,ds.$$
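This integral identity can also be verified numerically. A sketch with a hypothetical `Q`, `f`, starting state `x`, and horizon `T`, using $\phi(t)=(e^{tQ}f)(x)$ and $\phi'(s)=\mathbb E_x(Qf(X_s))=(e^{sQ}Qf)(x)$:

```python
import numpy as np

def expm_taylor(A, terms=60):
    # Matrix exponential exp(A) via truncated Taylor series.
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

Q = np.array([[-3.0,  2.0,  1.0],
              [ 4.0, -6.0,  2.0],
              [ 0.5,  0.5, -1.0]])
f = np.array([1.0, -2.0, 5.0])
x, T = 0, 0.5

# phi(t) = E_x f(X_t) = (exp(tQ) f)(x);  phi'(s) = E_x(Qf(X_s)) = (exp(sQ) Qf)(x)
ss = np.linspace(0.0, T, 501)
vals = np.array([(expm_taylor(s * Q) @ (Q @ f))[x] for s in ss])

# Integrate phi'(s) over [0, T] with the trapezoidal rule
integral = np.sum((vals[1:] + vals[:-1]) / 2) * (ss[1] - ss[0])

phi_T = (expm_taylor(T * Q) @ f)[x]
err = abs(phi_T - (f[x] + integral))
print(err)  # close to zero: phi(T) = phi(0) + integral of phi'(s) ds
```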