Generating function of the numbers of visits up to time $n$


I want to determine the generating function of the numbers of visits up to a given time. Formally: let $\{X_{n}\}_{n \geq 0}$ be a Markov chain on the state space $S = \{1,\ldots,r\}$ with transition matrix $P = (p_{i,j})_{1 \leq i,j \leq r}$ and initial probability vector $\alpha = (\alpha_{i} : i \in S)$, where $\alpha_{i} = \mathbb{P}(X_{0} = i)$. Let $q(n_{1},\ldots,n_{r})$ be the probability that up to time $n-1$ state $1$ is visited $n_{1}$ times, $\ldots$, state $r$ is visited $n_{r}$ times, and let $$F_{n}(x_{1},\ldots,x_{r}) = \sum_{n_{1}+ \cdots +n_{r} = n}q(n_{1},\ldots,n_{r})\,x_{1}^{n_{1}}\cdots x_{r}^{n_{r}}$$ be the corresponding generating function. Now I want to show that $$ F_{n}(x_{1},\ldots,x_{r}) = (\alpha_{1}x_{1},\ldots,\alpha_{r}x_{r}) \left( P D\right)^{n-1} e, $$ where $e = (1,\ldots,1)^{\intercal}$ and $D$ is the diagonal matrix $$ D = \begin{bmatrix} x_{1} && \\ & \ddots &\\ & & x_{r} \end{bmatrix}. $$

Here is my attempt: $$ \begin{align*} \sum_{n_{1} + \cdots + n_{r} = n}q(n_{1},\ldots,n_{r}) &= \sum_{j \in S}\mathbb{P}(X_{n-1} = j) \\ &= \sum_{j \in S}\sum_{i \in S}\alpha_{i}p_{i,j}^{(n-1)} \\ &= \left(\sum_{i \in S}\alpha_{i}p_{i,1}^{(n-1)},\ldots,\sum_{i \in S}\alpha_{i}p_{i,r}^{(n-1)} \right) e \\ &= \alpha P^{n-1} e, \end{align*} $$ where $p_{i,j}^{(n-1)}$ denotes the $(i,j)$ entry of $P^{n-1}$. Does this suffice as a proof, and is it correct? Thanks for any help!
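As a quick numerical sanity check of the computation above (with a made-up 3-state chain; the matrix `P` and vector `alpha` are arbitrary example values, not from the question), the sum of $q(n_1,\ldots,n_r)$ over all $(n_1,\ldots,n_r)$ with $n_1+\cdots+n_r=n$ should equal $\alpha P^{n-1} e$, which is $1$ because $P$ is stochastic:

```python
import numpy as np

# Hypothetical 3-state chain: rows of P sum to 1, alpha sums to 1.
P = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
alpha = np.array([0.5, 0.3, 0.2])
e = np.ones(3)
n = 5

# alpha P^{n-1} e: total probability mass over all length-n paths.
val = alpha @ np.linalg.matrix_power(P, n - 1) @ e
print(val)  # 1.0 up to floating-point error, since P is stochastic
```

This only confirms the normalization, i.e. the special case $x_1 = \cdots = x_r = 1$.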


Your computation is correct, but it only proves the special case $x_1 = \cdots = x_r = 1$. To prove the general case, set $$A(n_1, \ldots, n_r) = \left\{\sigma \colon \{0, \ldots, n-1\} \to S \; \middle| \; \#\sigma^{-1}(i) = n_i \text{ for all } i\right\},$$ the set of state sequences that visit each state $i$ exactly $n_i$ times. Then $$\begin{align*} q(n_1,\ldots, n_r) &= \mathbb{P}\big(\exists \sigma \in A(n_1, \ldots, n_r)\colon \, X_k = \sigma(k) \text{ for } 0 \leq k \leq n-1 \big) \\ &= \sum_{\sigma \in A(n_1, \ldots, n_r)} \mathbb{P}\big( X_k = \sigma(k) \text{ for } 0 \leq k \leq n-1\big) \\ &= \sum_{\sigma \in A(n_1, \ldots, n_r)} \alpha_{\sigma(0)}\,p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1)), \end{align*}$$ where the last step uses the Markov property. Now we get $$\begin{align*} F_n(x_1, \ldots, x_r) &= \sum_{n_1 + \cdots + n_r = n}q(n_1,\ldots, n_r)\, x_1^{n_1} \cdots x_r^{n_r} \\ &= \sum_{n_1 + \cdots + n_r = n} \bigg(\sum_{\sigma \in A(n_1, \ldots, n_r)} \alpha_{\sigma(0)}\,p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1)) \bigg)x_1^{n_1} \cdots x_r^{n_r} \\ &= \sum_{\substack{n_1 + \cdots + n_r = n \\ \sigma \in A(n_1, \ldots, n_r)}} \alpha_{\sigma(0)}\,p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1))\, x_1^{\#\sigma^{-1}(1)} \cdots x_r^{\#\sigma^{-1}(r)} \\ &= \sum_{\sigma \in A} \alpha_{\sigma(0)}\,p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1))\, x_1^{\#\sigma^{-1}(1)} \cdots x_r^{\#\sigma^{-1}(r)}, \end{align*}$$ where $A = \bigcup_{n_1 + \cdots + n_r = n} A(n_1, \ldots, n_r)$. The last equality holds because the sets $A(n_1, \ldots, n_r)$ are pairwise disjoint. Notice that $A$ is in fact the set of all functions from $\{0, \ldots, n-1\}$ to $S$: if $\sigma$ is such a function, then setting $n_i = \# \sigma^{-1}(i)$ we have $\sum_i n_i = \# \sigma^{-1}(S) = \# \{0, \ldots, n-1 \} = n$, and clearly $\sigma \in A(n_1, \ldots, n_r)$. On the other hand, $x_1^{\#\sigma^{-1}(1)} \cdots x_r^{\#\sigma^{-1}(r)} = x_{\sigma(0)}x_{\sigma(1)} \cdots x_{\sigma(n-1)}$, since each $x_i$ appears the same number of times in both products.
Therefore $$ \begin{align*}F_n(x_1, \ldots, x_r) = \sum_{\sigma \colon \{0, \ldots, n-1\} \to S} \alpha_{\sigma(0)}\,p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1))\, x_{\sigma(0)}x_{\sigma(1)} \cdots x_{\sigma(n-1)}.\end{align*}$$ This can be rewritten as $$F_n(x_1, \ldots, x_r) = \sum_{i_0, \ldots, i_{n-1} \in S} \alpha_{i_0}x_{i_0}\,p(i_0,i_1)x_{i_1}\cdots p(i_{n-2}, i_{n-1})x_{i_{n-1}}.$$ Finally, expanding the product $(\alpha_{1}x_{1},\ldots,\alpha_{r}x_{r})(PD)^{n-1}e$ entrywise yields exactly this sum, because the $(i,j)$ entry of $PD$ is $p(i,j)x_j$; this proves the desired equality.
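The identity can also be checked numerically by brute force: enumerate all $r^n$ paths $\sigma$, sum their weights $\alpha_{\sigma(0)}x_{\sigma(0)}\prod_k p(\sigma(k-1),\sigma(k))x_{\sigma(k)}$, and compare with the matrix formula. A minimal sketch with a made-up 2-state chain (the values of `P`, `alpha`, and `x` below are arbitrary assumptions for illustration):

```python
import numpy as np
from itertools import product

# Hypothetical 2-state chain for a brute-force check of the identity.
P = np.array([[0.3, 0.7],
              [0.6, 0.4]])
alpha = np.array([0.25, 0.75])
x = np.array([0.9, 1.3])   # arbitrary evaluation point (x_1, x_2)
r, n = 2, 4

# Right-hand side: (alpha_1 x_1, ..., alpha_r x_r) (P D)^{n-1} e.
D = np.diag(x)
rhs = (alpha * x) @ np.linalg.matrix_power(P @ D, n - 1) @ np.ones(r)

# Left-hand side: sum over all paths sigma: {0,...,n-1} -> S of
# alpha_{sigma(0)} p(sigma(0),sigma(1)) ... p(sigma(n-2),sigma(n-1))
# times x_{sigma(0)} ... x_{sigma(n-1)}.
lhs = 0.0
for sigma in product(range(r), repeat=n):
    w = alpha[sigma[0]] * x[sigma[0]]
    for k in range(1, n):
        w *= P[sigma[k - 1], sigma[k]] * x[sigma[k]]
    lhs += w

print(abs(lhs - rhs) < 1e-12)  # True: the two expressions agree
```

The enumeration costs $r^n$ operations, so it is only feasible for tiny examples, but it matches the matrix formula term by term: each factor $p(i,j)x_j$ in the path weight is exactly one entry of $PD$.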