How to prove, by definition, that the given process is a Markov process?


Define the process $X_t$ by $X_0 = 1$ and, for $t = 1, 2, \dots$,

$$X_t = \begin{cases} uX_{t-1} & \text{with probability } p,\\ vX_{t-1} & \text{with probability } 1-p,\end{cases}$$

where $0 < v < 1 < u$ are prescribed numbers. The process $X_t$ defined this way is a simple model for describing stock price movements; e.g. $X_t$ represents the price of a stock at day $t$. The model states that at day $t+1$ the price moves up to $uX_t$ with probability $p$, or down to $vX_t$ with probability $1-p$.

The process $\{X_t\}$ is a Markov chain. (The state space of this chain is not the usual one consisting of states $0, 1, 2, \dots$, but the states may be renamed in this fashion.) Use the definition of a Markov chain and explain why $\{X_t\}$ is a Markov chain.

First, I want to represent it in the traditional fashion (how do I convert it?), and then prove that it is a Markov process. Kindly help me out.

Best answer:

Initial distribution + transition probabilities:

  • Initial distribution: $P[X_0=1]=1$.
  • Transition probabilities: $Q(x,ux)=p$ for every $x\gt0$, $Q(x,vx)=1-p$ for every $x\gt0$ and $Q(x,y)=0$ for every $x\gt0$ and every $y\gt0$ not in $\{ux,vx\}$.
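The transition kernel $Q$ above can be coded directly as a sanity check. A minimal sketch, where the parameter values $u = 1.1$, $v = 0.9$, $p = 0.6$ are illustrative assumptions (the problem only requires $0 < v < 1 < u$):

```python
def make_kernel(u, v, p):
    """Transition kernel Q(x, y) for the multiplicative chain:
    Q(x, u*x) = p, Q(x, v*x) = 1 - p, and Q(x, y) = 0 otherwise."""
    def Q(x, y):
        if x <= 0:
            raise ValueError("states are positive prices")
        if y == u * x:
            return p
        if y == v * x:
            return 1 - p
        return 0.0
    return Q

# Illustrative parameters (assumed, not given in the problem).
u, v, p = 1.1, 0.9, 0.6
Q = make_kernel(u, v, p)

x = 1.0
# From any state x the chain moves to ux or vx, so each row sums to 1.
row_sum = Q(x, u * x) + Q(x, v * x)
```

Since $u > 1 > v$, the targets $ux$ and $vx$ never coincide, so the kernel is well defined.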

Iteration of random functions:

  • $X_0=1$ and $X_{n+1}=Y_{n+1}X_n$ for every $n\geqslant0$, where $(Y_n)_{n\geqslant1}$ is independent with common distribution $P[Y_n=u]=p$, $P[Y_n=v]=1-p$ for every $n$.
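This iteration-of-random-functions view translates directly into a simulation. A minimal sketch, with parameter values chosen for illustration only:

```python
import random

def simulate(u, v, p, n_steps, seed=0):
    """Simulate X_0 = 1 and X_{n+1} = Y_{n+1} * X_n, where the Y_n are
    i.i.d. with P[Y = u] = p and P[Y = v] = 1 - p."""
    rng = random.Random(seed)
    x = 1.0
    path = [x]
    for _ in range(n_steps):
        y = u if rng.random() < p else v  # draw the next multiplier Y
        x = y * x
        path.append(x)
    return path

# Illustrative parameters (assumed; the problem only requires 0 < v < 1 < u).
path = simulate(u=1.1, v=0.9, p=0.6, n_steps=10)
```

Note that after $t$ steps the price is $u^k v^{t-k}$, where $k$ is the number of up-moves so far; each reachable state is therefore determined by the pair $(t, k)$, which is one way to relabel the states by integers as the problem statement hints.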

Conditional independence:

  • The sigma-algebra $\mathcal F^X_n$ of the past is $\mathcal F^X_n=\sigma(X_k,k\leqslant n)$. Since $X_{n+1}=Y_{n+1}X_n$ where $Y_{n+1}$ is independent of $\mathcal F^X_n$ and $X_n$ is $\mathcal F^X_n$-measurable, $X_{n+1}$ conditionally on $\mathcal F^X_n$ and $[X_n=x]$ is distributed like $Yx$, where $Y$ is any random variable such that $P[Y=u]=p$, $P[Y=v]=1-p$. This distribution depends on $\mathcal F^X_n$ only through $x$ hence $(X_n)$ is a Markov chain.
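The conditional-independence argument can also be illustrated empirically: because the multipliers $Y_n$ are i.i.d. and independent of the past, the frequency of up-moves should be close to $p$ regardless of what the previous move was. A rough sketch, with sample size and $p$ chosen for illustration:

```python
import random

def up_move_fractions(p, n_steps, seed=1):
    """Simulate one long sequence of moves and compare the frequency of
    up-moves after an up-move vs after a down-move; both should be close
    to p, since the multipliers are i.i.d. and independent of the past."""
    rng = random.Random(seed)
    moves = [rng.random() < p for _ in range(n_steps)]  # True = up (factor u)
    after_up = [moves[i + 1] for i in range(n_steps - 1) if moves[i]]
    after_down = [moves[i + 1] for i in range(n_steps - 1) if not moves[i]]
    return sum(after_up) / len(after_up), sum(after_down) / len(after_down)

frac_up, frac_down = up_move_fractions(p=0.5, n_steps=100_000)
# Both fractions are close to p = 0.5, whatever the previous move was.
```

This is only a heuristic check, of course; the proof above via the independence of $Y_{n+1}$ from $\mathcal F^X_n$ is what establishes the Markov property.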