I want to determine the generating function of the numbers of visits up to a specific time. Formally: let $\{X_{n}\}_{n \geq 0}$ be a Markov chain on the state space $S = \{1,\ldots,r\}$ with transition matrix $P = (p_{i,j})_{1 \leq i,j \leq r}$ and initial probability vector $\alpha = (\alpha_{i} : i \in S)$, where $\alpha_{i} = \mathbb{P}(X_{0} = i)$. Let $q(n_{1},\ldots,n_{r})$ be the probability that up to time $n-1$ (i.e. among $X_{0},\ldots,X_{n-1}$) state $1$ is visited $n_{1}$ times, $\ldots$, and state $r$ is visited $n_{r}$ times, and let $$F_{n}(x_{1},\ldots,x_{r}) = \sum_{n_{1}+ \cdots +n_{r} = n}q(n_{1},\ldots,n_{r})\,x_{1}^{n_{1}}\cdots x_{r}^{n_{r}}$$ be the corresponding generating function of $q(n_{1},\ldots,n_{r})$. Now I want to show that $$ F_{n}(x_{1},\ldots,x_{r}) = (\alpha_{1}x_{1},\ldots,\alpha_{r}x_{r}) \left( P D\right)^{n-1} e, $$ where $e = (1,\ldots,1)^{\intercal}$ and $D$ is the diagonal matrix $$ D = \begin{bmatrix} x_{1} && \\ & \ddots &\\ & & x_{r} \end{bmatrix}. $$ Here is my attempt, writing $p_{i,j}^{(n-1)}$ for the $(i,j)$ entry of $P^{n-1}$: $$ \begin{align*} \sum_{n_{1} + \cdots + n_{r} = n}q(n_{1},\ldots,n_{r}) &= \sum_{j \in S}\mathbb{P}(X_{n-1} = j) \\ &= \sum_{j \in S}\sum_{i \in S}\alpha_{i}\,p_{i,j}^{(n-1)} \\ &= \left(\sum_{i \in S}\alpha_{i}p_{i,1}^{(n-1)},\ldots,\sum_{i \in S}\alpha_{i}p_{i,r}^{(n-1)} \right) e \\ &= \alpha P^{n-1} e. \end{align*} $$ Does this suffice as a proof, and is it correct? Thanks for any help!
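(As a sanity check, not part of a proof, the claimed formula can be compared numerically against brute-force enumeration of all length-$n$ paths. The transition matrix, initial distribution, and evaluation point below are arbitrary small examples.)

```python
import itertools
import numpy as np

# An arbitrary 3-state chain: transition matrix P, initial law alpha.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.6, 0.3, 0.1]])
alpha = np.array([0.5, 0.3, 0.2])
x = np.array([0.7, 1.3, 0.4])   # arbitrary evaluation point (x_1, x_2, x_3)
n = 5
r = len(alpha)

# Brute force: F_n(x) = sum over all paths (i_0, ..., i_{n-1}) of
#   alpha_{i_0} * prod_k p(i_k, i_{k+1}) * prod_k x_{i_k}.
brute = 0.0
for path in itertools.product(range(r), repeat=n):
    prob = alpha[path[0]]
    for a, b in zip(path, path[1:]):
        prob *= P[a, b]
    brute += prob * np.prod(x[list(path)])

# Claimed matrix formula: (alpha_1 x_1, ..., alpha_r x_r) (P D)^{n-1} e.
D = np.diag(x)
matrix = (alpha * x) @ np.linalg.matrix_power(P @ D, n - 1) @ np.ones(r)

print(brute, matrix)  # the two values agree up to floating-point rounding
```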
generating function of numbers of visits up to time n
104 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

There is 1 solution below.
Your computation is correct, but it only proves the special case $x_1 = \cdots = x_r = 1$ (where both sides equal $1$). To prove the general case, set $$A(n_1, \ldots, n_r) = \left\{\sigma \colon \{0, \ldots, n-1\} \to S \; \middle| \; \#\sigma^{-1}(i) = n_i \text{ for every } i \in S\right\}.$$ Then, since the events indexed by distinct $\sigma$ are disjoint, $$\begin{align*} q(n_1,\ldots, n_r) &= \mathbb{P}\big(\exists \sigma \in A(n_1, \ldots, n_r) \colon X_k = \sigma(k) \text{ for } 0 \leq k \leq n-1 \big) \\ &= \sum_{\sigma \in A(n_1, \ldots, n_r)} \mathbb{P}\big( X_k = \sigma(k) \text{ for } 0 \leq k \leq n-1\big) \\ &= \sum_{\sigma \in A(n_1, \ldots, n_r)} \alpha_{\sigma(0)}p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1)), \end{align*}$$ where the last step uses the Markov property. Now we get $$\begin{align*} F_n(x_1, \ldots, x_r) &= \sum_{n_1 + \cdots + n_r = n}q(n_1,\ldots, n_r)\, x_1^{n_1} \cdots x_r^{n_r} \\ &= \sum_{n_1 + \cdots + n_r = n} \bigg(\sum_{\sigma \in A(n_1, \ldots, n_r)} \alpha_{\sigma(0)}p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1)) \bigg)x_1^{n_1} \cdots x_r^{n_r} \\ &= \sum_{\substack{n_1 + \cdots + n_r = n \\ \sigma \in A(n_1, \ldots, n_r)}} \alpha_{\sigma(0)}p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1))\, x_1^{\#\sigma^{-1}(1)} \cdots x_r^{\#\sigma^{-1}(r)} \\ &= \sum_{\sigma \in A} \alpha_{\sigma(0)}p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1))\, x_1^{\#\sigma^{-1}(1)} \cdots x_r^{\#\sigma^{-1}(r)}, \end{align*}$$ where $A = \bigcup_{n_1 + \cdots + n_r = n} A(n_1, \ldots, n_r)$. The last equality holds because the sets $A(n_1, \ldots, n_r)$ are pairwise disjoint. Notice that $A$ is in fact the set of *all* functions from $\{0, \ldots, n-1\}$ to $S$: if $\sigma$ is such a function, then setting $n_i = \# \sigma^{-1}(i)$ we have $\sum_i n_i = \# \sigma^{-1}(S) = \# \{0, \ldots, n-1 \} = n$, and clearly $\sigma \in A(n_1, \ldots, n_r)$. On the other hand, $x_1^{\#\sigma^{-1}(1)} \cdots x_r^{\#\sigma^{-1}(r)} = x_{\sigma(0)}x_{\sigma(1)} \cdots x_{\sigma(n-1)}$ (count how many times each $x_i$ appears on both sides).
Therefore $$F_n(x_1, \ldots, x_r) = \sum_{\sigma \colon \{0, \ldots, n-1\} \to S} \alpha_{\sigma(0)}p(\sigma(0),\sigma(1))\cdots p(\sigma(n-2), \sigma(n-1))\, x_{\sigma(0)}x_{\sigma(1)} \cdots x_{\sigma(n-1)}.$$ This can be rewritten as $$F_n(x_1, \ldots, x_r) = \sum_{i_0, \ldots, i_{n-1} \in S} \alpha_{i_0}x_{i_0}\,p(i_0,i_1)x_{i_1}\cdots p(i_{n-2}, i_{n-1})x_{i_{n-1}}.$$ Finally, expanding the product $(\alpha_{1}x_{1},\ldots,\alpha_{r}x_{r})(PD)^{n-1}e$ entry by entry gives exactly this sum: the row vector contributes the factor $\alpha_{i_0}x_{i_0}$, each of the $n-1$ factors $PD$ contributes a factor $p(i_k, i_{k+1})\,x_{i_{k+1}}$ (since $(PD)_{ij} = p(i,j)\,x_j$), and multiplying by $e$ sums over the last index $i_{n-1}$. This proves the desired equality.
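To see this last expansion concretely, one can carry out the matrix product $\alpha D (PD)^{n-1} e$ with exact polynomial arithmetic and compare it coefficient-by-coefficient with the path sum defining $F_n$. Here is a sketch in plain Python (the $2$-state chain and the values of $n$, $P$, $\alpha$ are arbitrary illustrative choices), representing a polynomial in $x_1,\ldots,x_r$ as a dict mapping exponent tuples to coefficients:

```python
from collections import defaultdict
from fractions import Fraction
from itertools import product

r, n = 2, 3
P = [[Fraction(1, 4), Fraction(3, 4)],
     [Fraction(1, 2), Fraction(1, 2)]]
alpha = [Fraction(1, 3), Fraction(2, 3)]

def mono(i):
    """Exponent tuple of the monomial x_i."""
    e = [0] * r
    e[i] = 1
    return tuple(e)

def pmul(p, q):
    """Product of two polynomials {exponent tuple: coefficient}."""
    out = defaultdict(Fraction)
    for ea, ca in p.items():
        for eb, cb in q.items():
            out[tuple(a + b for a, b in zip(ea, eb))] += ca * cb
    return dict(out)

def padd(p, q):
    """Sum of two polynomials, dropping zero coefficients."""
    out = defaultdict(Fraction, p)
    for e, c in q.items():
        out[e] += c
    return {e: c for e, c in out.items() if c}

# Row vector alpha * D = (alpha_1 x_1, ..., alpha_r x_r).
row = [{mono(i): alpha[i]} for i in range(r)]

# Multiply by P D a total of n-1 times; (PD)_{ij} = p_{ij} x_j.
for _ in range(n - 1):
    row = [
        # j-th entry of row * (P D): sum_i row_i * p_{ij} x_j
        sum((pmul(row[i], {mono(j): P[i][j]}) for i in range(r)), start={})
        if False else
        __import__('functools').reduce(padd, (pmul(row[i], {mono(j): P[i][j]}) for i in range(r)))
        for j in range(r)
    ]

# Multiply by e: sum the entries.  This is the fully expanded RHS.
rhs = {}
for entry in row:
    rhs = padd(rhs, entry)

# LHS: F_n via path enumeration; the coefficient of x^(n_1,...,n_r) is
# q(n_1,...,n_r), the probability of the corresponding visit counts.
lhs = defaultdict(Fraction)
for path in product(range(r), repeat=n):
    prob = alpha[path[0]]
    for a, b in zip(path, path[1:]):
        prob *= P[a][b]
    lhs[tuple(path.count(i) for i in range(r))] += prob
lhs = {e: c for e, c in lhs.items() if c}

print(lhs == rhs)  # True: the two polynomials agree coefficient-wise
```

Since all arithmetic is over `Fraction`, the comparison is exact; the coefficients also sum to $1$, matching your computation of $F_n(1,\ldots,1)$.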