Definitions involving Markov chains


I'm learning stochastic processes and Markov chains in a class meant for engineering students, and as a mathematics undergrad I'm having a lot of trouble with the lack of precise definitions on this subject. I tried googling some of this stuff, but most websites I found either didn't talk about this or were very imprecise about it.

Firstly: When we say that $X: T\times \Omega \to S$ is a discrete stochastic process, do we mean that the set $T$ is finite or countable, or do we mean that for any $t\in T$, the random variable $X(t,\cdot)$ is a discrete random variable?

Now, let $X$ be a Markov chain. From what I was able to piece together, this means that $X: T\times \Omega\to \mathbb R$ is a stochastic process with the Markov property, that is: for any $t_1<t_2<...<t_n<t_{n+1}$ in $T$, we have that $$\mathbb P(X(t_{n+1})=x|X(t_n)=x_n,...,X(t_1)=x_1)=\mathbb P(X(t_{n+1})=x|X(t_n)=x_n)$$

My second question is: my teacher said that a stochastic process that satisfies the Markov property is a Markov process. So what is the difference between a Markov process and a Markov chain? My teacher defined them in the exact same way.

My third question is: what exactly are the states of the Markov chain? For all $t\in T$, $X(t)$ is a random variable on the probability space $(\Omega,\mathcal A,\mathbb P)$. So are the states the elements of $\Omega$, or are they the union of the images of $\Omega$ under each random variable $X(t)$?

My last question is: the fact that we can write transition diagrams seems to imply that, for any $n, m$, we have that $$\mathbb P(X(t_{n+1})=j|X(t_n)=i)=P_{ij}=\mathbb P(X(t_{m+1})=j|X(t_m)=i)$$ But I don't see how this follows from the Markov property. Why is this true?

On BEST ANSWER

Here is a more precise definition of a discrete-time Markov chain. A discrete-time Markov chain (synonymous with Markov process in this discrete setting) is first and foremost a sequence of random variables $(X_n)_{n \in \mathbb N}$. Thus, to answer your first question, $T$ is countable for Markov chains; hence we might as well take $T = \mathbb N$. The sequence $(X_n)_{n \in \mathbb N}$ satisfies:

  • each $X_n : \Omega \to \mathbb R$ shares the same probability space $(\Omega, \mathcal{A}, \mathbb P)$ and the same discrete support $R_{X_n} = S \subseteq \mathbb N$. This common support $S$ is called the state space.
  • the Markov property i.e. $$ \mathbb P[X_{n + 1} = x \ | \ X_n = x_n, \ldots, X_0 = x_0] = \mathbb P[X_{n + 1} = x \ | \ X_n = x_n] $$ for all $n \in \mathbb N$ and all $x, x_n, \ldots, x_0 \in S$.

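In algorithmic terms, the Markov property says the next state is sampled using only the current state. Here is a minimal sketch in Python of a (time-homogeneous) chain on the assumed state space $S = \{0, 1\}$; the matrix `P` and the function name `simulate` are illustrative choices, not from the original discussion:

```python
import random

# Hypothetical two-state chain on S = {0, 1}; P[i][j] is the probability
# of moving from state i to state j (each row sums to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(n_steps, start=0, seed=42):
    """Simulate one sample path (X_0, X_1, ..., X_n) of the chain."""
    rng = random.Random(seed)
    x = start
    path = [x]
    for _ in range(n_steps):
        # The next state is drawn using only the current state x --
        # this is exactly the Markov property in algorithmic form.
        x = rng.choices([0, 1], weights=P[x])[0]
        path.append(x)
    return path

path = simulate(10)
```

Note that the full history `path` is recorded but never consulted when sampling the next state; only `x` matters.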
You cannot in general prove $$ \mathbb P[X_{n + 1} = j \ |\ X_n = i] = \mathbb P[X_{m + 1} = j \ | \ X_m = i] $$ for all $n,m$ from the Markov property. Markov chains that satisfy this are called time-homogeneous and it is an additional property assumed for most elementary examples involving transition diagrams.
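To see concretely that the Markov property does not force time-homogeneity, here is an assumed toy example (the state space $\{0,1\}$ and the particular probabilities are invented for illustration): the next state still depends only on the current state, but the transition matrix changes with the time index $n$.

```python
import random

# Sketch of a time-INhomogeneous Markov chain on S = {0, 1}: it has the
# Markov property, yet P[X_{n+1} = j | X_n = i] depends on n.
def transition_matrix(n):
    # At even times the chain stays put with probability 0.9;
    # at odd times, only with probability 0.2 (an arbitrary choice).
    p_stay = 0.9 if n % 2 == 0 else 0.2
    return [[p_stay, 1 - p_stay],
            [1 - p_stay, p_stay]]

def step(x, n, rng):
    # The distribution of the next state depends only on the current
    # state x (Markov property), but the matrix used depends on n.
    return rng.choices([0, 1], weights=transition_matrix(n)[x])[0]

rng = random.Random(7)
samples = [step(0, n, rng) for n in range(4)]
```

Here `transition_matrix(0)` and `transition_matrix(1)` differ, so no single transition diagram describes the chain; drawing one implicitly assumes time-homogeneity.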