Define the process $X_t$ by $X_0 = 1$, and for $t = 1, 2, \dots$
$$X_t = \begin{cases} uX_{t-1}, & \text{with probability } p,\\ vX_{t-1}, & \text{with probability } 1-p,\end{cases}$$
where $0 < v < 1 < u$ are prescribed numbers. The process $X_t$ defined this way is a simple model for describing stock price movements; e.g. $X_t$ represents the price of a stock at day $t$. The model states that at day $t+1$ the price moves (up to) $uX_t$ with probability $p$, or (down to) $vX_t$ with probability $1-p$.
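To make the dynamics concrete, here is a minimal simulation sketch of the process, using illustrative values $u = 1.1$, $v = 0.9$, $p = 0.5$ (these specific numbers are my assumption, not part of the exercise):

```python
import random

def simulate(u, v, p, T, seed=0):
    """Simulate X_0 = 1 and X_t = u*X_{t-1} with probability p,
    else X_t = v*X_{t-1}, for t = 1, ..., T."""
    rng = random.Random(seed)
    x = 1.0
    path = [x]
    for _ in range(T):
        # Each day: up-factor u with probability p, down-factor v otherwise
        x *= u if rng.random() < p else v
        path.append(x)
    return path

path = simulate(u=1.1, v=0.9, p=0.5, T=5)
print(path)
```

Every value in a sample path is of the form $u^j v^k$ for nonnegative integers $j + k = t$, which already hints at what the state space looks like.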
The process $\{X_t\}$ is a Markov chain. (The state space of this chain is not the usual one consisting of states $0, 1, 2, \dots$, but the states may be renamed in this fashion.) Use the definition of a Markov chain to explain why $\{X_t\}$ is a Markov chain.
First, how do I represent this chain in the traditional fashion, and how do I convert to it? And then, how do I prove that it is a Markov process? Any help would be appreciated.
Initial distribution + transition probabilities: specify where the chain starts and the one-step transition law $P(x, \cdot)$.
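A sketch of the chain's data in this traditional representation:

```latex
% Initial distribution: all mass at the state 1.
\Pr(X_0 = 1) = 1.
% One-step transition probabilities on the countable state space
% S = \{ u^j v^k : j, k = 0, 1, 2, \dots \}:
P(x, y) =
\begin{cases}
  p,   & y = u x,\\
  1 - p, & y = v x,\\
  0,   & \text{otherwise}.
\end{cases}
```

Since $S$ is countable, its states can be enumerated (e.g. relabel $u^j v^k$ by the pair $(j, k)$), which is the "renaming" the problem statement alludes to.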
Iteration of random functions: write $X_t$ as the result of applying i.i.d. random maps to $X_{t-1}$.
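One way to make this precise (a sketch): let $\xi_1, \xi_2, \dots$ be i.i.d. Bernoulli$(p)$ random variables, independent of $X_0$, and write

```latex
X_t = f(X_{t-1}, \xi_t),
\qquad
f(x, \xi) = u^{\xi}\, v^{1-\xi}\, x .
% Iterating from X_0 = 1 gives the closed form
X_t = u^{N_t}\, v^{\,t - N_t},
\qquad
N_t = \sum_{s=1}^{t} \xi_s \sim \mathrm{Binomial}(t, p).
```

Any process of the form $X_t = f(X_{t-1}, \xi_t)$ with $(\xi_t)$ i.i.d. and independent of $X_0$ is a Markov chain, because the next state is a fixed function of the current state and fresh randomness.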
Conditional independence: show that, given $X_t$, the next value $X_{t+1}$ is independent of the earlier history $X_0, \dots, X_{t-1}$; that is, $\Pr(X_{t+1} = y \mid X_t = x, X_{t-1}, \dots, X_0) = \Pr(X_{t+1} = y \mid X_t = x)$.
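A quick empirical sanity check of this property (a sketch, again with my illustrative values $u=1.1$, $v=0.9$, $p=0.5$): whatever the history of up/down moves so far, the conditional frequency of the next move being "up" should stay near $p$.

```python
import random

# Simulate many length-4 move sequences; the path of X is determined by the
# up/down moves since X_0 = 1. For each 3-move history, estimate the
# conditional probability that the 4th move is "up".
u, v, p, T, n_paths = 1.1, 0.9, 0.5, 3, 200_000
rng = random.Random(42)

counts = {}  # history (tuple of up/down moves) -> (ups observed after it, total)
for _ in range(n_paths):
    moves = tuple(rng.random() < p for _ in range(T + 1))
    hist, nxt = moves[:T], moves[T]
    ups, tot = counts.get(hist, (0, 0))
    counts[hist] = (ups + nxt, tot + 1)

# Every conditional frequency P(up | history) should be close to p = 0.5,
# regardless of the history -- the Markov/conditional-independence property.
freqs = {h: ups / tot for h, (ups, tot) in counts.items()}
print(freqs)
```

All eight conditional frequencies come out close to $p$, consistent with the claim that the past beyond $X_t$ carries no extra information.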