Martingale that is not a Markov process: one technical issue in existing examples.


My question is technically not new. However, I am a bit confused by certain setups. A thread appeared here:

Martingale that is not a Markov process

The spirit of the answer provided by Did is clear. One detail is not so clear to me, though. Let me simplify to a discrete version of that argument (for instance, see https://imathworks.com/math/math-stochastic-process-that-is-martingale-but-not-markov/):

  1. Pick a random value for $X_0$;

  2. Consider a sequence of random variables $\{\epsilon_n: n \ge 0\} $ that are i.i.d. with mean $E[\epsilon_n]=0 $ AND independent of $X_0 $ (note that here, as in Did's example, independence is only assumed with respect to $X_0$);

  3. Now we define a stochastic process: $X_{n+1}=X_n+\epsilon_{n+1}X_0$;

  4. Then we proceed to prove that the stochastic process $X=(X_n: n \ge 0) $ is a martingale. The proof goes along the lines: $$ E[X_{n+1} \mid X_0,\ldots,X_n]=E[X_n\mid X_0, \ldots,X_n]+ E[\epsilon_{n+1}X_0\mid X_0, \ldots, X_n] $$ $$ =X_n + E[\epsilon_{n+1}\mid X_0, \ldots,X_n]\,E[X_0\mid X_0,\ldots, X_n] $$ $$ =X_n+E[\epsilon_{n+1}]X_0 = X_n+ 0\cdot X_0 = X_n. $$

and here is where I stop following: the author justifies this last step by saying that "it is key that $\epsilon_n $ is independent of the $\{X_i, 0\le i\le n\}$."

MY QUESTION: where does the fact that $\epsilon_n $ is independent of $\{X_i, 0\le i\le n\} $ come from, when our initial assumption was only that the $\epsilon_n$'s are independent of $X_0 $?

Once this is done then, yes, we have an example of a martingale that is not a Markov process.
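To see concretely why this example, once the independence issue is settled, is a martingale but not Markov, here is a small Monte Carlo sketch. The particular distributions ($\epsilon_n = \pm 1$ with equal probability, $X_0$ uniform on $\{1,2\}$) are my own choices, not part of the original argument: conditional on $X_1=0$, the mean of $X_2$ is $0=X_1$ regardless of $X_0$ (the martingale property), while the variance of $X_2$ still depends on $X_0$, so the law of $X_2$ given the past is not a function of $X_1$ alone.

```python
import random

def simulate(n_paths=200_000, seed=0):
    """Simulate X_{n+1} = X_n + eps_{n+1} * X_0 for two steps and
    collect X_2 samples on the event {X_1 == 0}, grouped by X_0."""
    rng = random.Random(seed)
    samples = {1: [], 2: []}  # X_2 samples keyed by the value of X_0
    for _ in range(n_paths):
        x0 = rng.choice([1, 2])                   # assumed law of X_0
        e1, e2 = rng.choice([-1, 1]), rng.choice([-1, 1])
        x1 = x0 + e1 * x0                         # X_1 = X_0 + eps_1 * X_0
        if x1 != 0:                               # condition on {X_1 = 0}
            continue
        x2 = x1 + e2 * x0                         # X_2 = X_1 + eps_2 * X_0
        samples[x0].append(x2)
    return samples

samples = simulate()
for x0, xs in samples.items():
    mean = sum(xs) / len(xs)
    var = sum(x * x for x in xs) / len(xs) - mean ** 2
    print(f"X_0={x0}: E[X_2|X_1=0] ~ {mean:.3f}, Var(X_2|X_1=0) ~ {var:.3f}")
```

Both conditional means come out near $0=X_1$, but the conditional variance is near $1$ when $X_0=1$ and near $4$ when $X_0=2$.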

The original example by Did consisted in introducing a sequence $(Z_t: t\ge 2) $ of i.i.d. variables, not identically null and with mean zero, independent ON $X_0 $ (I also wonder what the exact meaning of being independent ON $X_0 $ actually is).

Then, Did lets $X_1=X_0=1 $ and $X_t=X_{t-1} + Z_t X_{t-2} $ for every $t\ge 2. $

Then, his argument uses the fact that $$ E[X_t \mid {\cal F}_{t-1}] = X_{t-1} $$ where I assume that ${\cal F}_t $ is a member of the natural filtration of the process. He can do so using the argument that

$$ E[Z_t X_{t-2}\mid X_{t-1}] = 0 $$ as a consequence of a general result that states that if $U $ is independent of $W, $ then $$ E[UV\mid W]=E[U]\,E[V\mid W]. $$

I find myself with the same issue. $Z_t $ plays the role of $\epsilon_{n+1} $ in the previous example, and in applying the result for conditional expectations $Z_t $ would have to be $U, $ $X_{t-2} $ is $V, $ and $W $ would have to be $X_{t-1}, $ right? But, again, how does one say that $Z_t $ is independent of $X_{t-1} $ when we started by only assuming that the $Z_t$'s were independent of $X_0 $?
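The same phenomenon can be checked numerically in Did's example. Taking $Z_t=\pm 1$ with equal probability (an assumption of mine; Did only asks for mean zero and not identically null), one can condition on $X_3=1$: the conditional mean of $X_4$ is $1=X_3$ whatever $X_2$ was, yet the conditional distribution of $X_4$ depends on $X_2$, so the chain is not Markov.

```python
import random

def did_example(n_paths=200_000, seed=1):
    """Simulate X_t = X_{t-1} + Z_t * X_{t-2} with X_0 = X_1 = 1 and
    collect X_4 samples on the event {X_3 == 1}, grouped by X_2."""
    rng = random.Random(seed)
    samples = {0: [], 2: []}  # X_4 samples keyed by the value of X_2
    for _ in range(n_paths):
        z2, z3, z4 = (rng.choice([-1, 1]) for _ in range(3))  # assumed Z_t = +/-1
        x0 = x1 = 1
        x2 = x1 + z2 * x0
        x3 = x2 + z3 * x1
        if x3 != 1:                       # condition on the event {X_3 = 1}
            continue
        x4 = x3 + z4 * x2
        samples[x2].append(x4)
    return samples

samples = did_example()
for x2, xs in samples.items():
    mean = sum(xs) / len(xs)
    var = sum(x * x for x in xs) / len(xs) - mean ** 2
    print(f"X_2={x2}: E[X_4|X_3=1] ~ {mean:.3f}, Var(X_4|X_3=1) ~ {var:.3f}")
```

Given $X_3=1$ and $X_2=0$, $X_4=1$ surely; given $X_3=1$ and $X_2=2$, $X_4\in\{-1,3\}$ with mean $1$ and variance $4$.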

1 Answer
I think I found the answer on my own. Going back to the first version of the example, it all comes down to considering how the $X_n$'s are constructed.

Let $X_0 $ be a random variable with finite expectation and consider a sequence of i.i.d. random variables $\{\epsilon_n: n \in \mathbb{N}\} $ such that $E[\epsilon_n]= 0 $ for each $n, $ independent of $X_0. $ Then define the stochastic process $X=\{X_n: n \in \mathbb{N}_0\} $ by letting $$ X_{n+1} = X_n + \epsilon_{n+1}X_0. $$

The process $X $ is now a martingale. In fact, if we introduce the natural filtration $\mathbb{F} = \{{\cal F}_n: n \in \mathbb{N}_0\} $ with ${\cal F}_n = \sigma(X_0, X_1, \ldots, X_n), $ then \begin{align*} E[X_{n+1} \mid {\cal F}_n] &=E[ X_n + \epsilon_{n+1}X_0 \mid {\cal F}_n]\\ &= X_n + X_0\, E[\epsilon_{n+1} \mid {\cal F}_n] = X_n + X_0\, E[\epsilon_{n+1}] = X_n + 0 = X_n. \end{align*}

We should notice that $X_n = (1+\epsilon_1+ \cdots + \epsilon_n)X_0, $ so that ${\cal F}_n \subseteq \sigma(X_0, \epsilon_1, \ldots, \epsilon_n); $ since by assumption the $\epsilon_n$'s are i.i.d. and (as a sequence) independent of $X_0, $ $\epsilon_{n+1} $ is independent of the vector $(X_0, \epsilon_1, \ldots, \epsilon_n) $ and therefore of ${\cal F}_n. $ This is exactly what justifies the step $E[\epsilon_{n+1} \mid {\cal F}_n] = E[\epsilon_{n+1}] $ above. In addition, since $X_0 $ has finite mean, so do all the $X_n$'s. Thus the process $X $ is indeed a martingale with respect to its natural filtration.
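The representation $X_n = (1+\epsilon_1+\cdots+\epsilon_n)X_0$ used above can be sanity-checked pathwise against the recursion. The Gaussian noise and the particular draw of $X_0$ below are my own illustrative choices:

```python
import random

rng = random.Random(42)
x0 = rng.uniform(0.5, 2.0)                   # an arbitrary draw for X_0
eps = [rng.gauss(0, 1) for _ in range(10)]   # centred noise eps_1, ..., eps_10

# Build the path via the recursion X_{n+1} = X_n + eps_{n+1} * X_0.
xs = [x0]
for e in eps:
    xs.append(xs[-1] + e * x0)

# Compare with the closed form X_n = (1 + eps_1 + ... + eps_n) * X_0.
for n in range(len(xs)):
    closed = (1 + sum(eps[:n])) * x0
    assert abs(xs[n] - closed) < 1e-9, (n, xs[n], closed)

print("recursion and closed form agree on all steps")
```

Since every $X_k$ with $k \le n$ is a function of $(X_0, \epsilon_1, \ldots, \epsilon_n)$, this is precisely why ${\cal F}_n \subseteq \sigma(X_0, \epsilon_1, \ldots, \epsilon_n)$.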