Guess the quadratic variation of two processes, with the same/different BM.


I have two processes $R$ and $S$ with stochastic parts: $$dR = \mu_1 dt + \sigma_1 dW_{t1}$$ $$dS = \mu_2 dt + \sigma_2 dW_{t2}$$ I am trying to show precisely what the quadratic variation $[R,S]$ means in two cases:

  1. $W_{t1} = W_{t2}$
  2. $W_{t1} \neq W_{t2}$

I know that by definition the quadratic variation of a function $f$ can be written as $$[f]=[f,f] = \lim_{n \to \infty} \sum_{i=1}^{k_n} \left(f_{t_i^n}-f_{t_{i-1}^n}\right)^2,$$ where we take partitions of the interval $[a,b]$ whose mesh tends to $0$.
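As a sanity check on this definition, here is a minimal numerical sketch (my own illustration, not from any textbook): for a standard Brownian motion on $[0,T]$, the sum of squared increments over a fine partition should be close to $T$.

```python
import numpy as np

rng = np.random.default_rng(0)

def quadratic_variation(path):
    """Sum of squared increments of a path over its sampling partition."""
    increments = np.diff(path)
    return np.sum(increments ** 2)

# Simulate a standard Brownian path on [0, T] with a uniform partition of n steps.
T, n = 1.0, 200_000
dt = T / n
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])

qv = quadratic_variation(W)
print(qv)  # close to T = 1.0 as the mesh shrinks
```

The fluctuation of this sum around $T$ has standard deviation of order $\sqrt{2/n}$, so with $n = 200{,}000$ the result is within a fraction of a percent of $1$.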

I do not understand this notation: $a=t_0^n<\dots<t_i^n<\dots<t_{k_n}^n=b$.

In particular, I do not understand why we need the superscript $n$.

Using our formula we write: $$[R,S] = \lim_{n \to \infty} \sum_{i=1}^{k_n} \left(R_{t_i^n}-R_{t_{i-1}^n}\right) \left(S_{t_i^n}-S_{t_{i-1}^n}\right),$$ and since $$R_{t_i}-R_{t_{i-1}} = \int_{t_{i-1}}^{t_i}\mu_1\, dt +\int_{t_{i-1}}^{t_i} \sigma_1\, dW_{t1},$$ we get in the general case: $$[R,S] = \lim_{n \to \infty} \sum_{i=1}^{k_n} \left( \int_{t_{i-1}}^{t_i}\mu_1\, dt +\int_{t_{i-1}}^{t_i} \sigma_1\, dW_{t1} \right) \left( \int_{t_{i-1}}^{t_i}\mu_2\, dt +\int_{t_{i-1}}^{t_i} \sigma_2\, dW_{t2} \right)$$

People neglect $\int_{t_{i-1}}^{t_i}\mu_1\, dt \cdot \int_{t_{i-1}}^{t_i}\mu_2\, dt$, but I cannot understand how they can do this, even if the partition is small. It is usually said that $f_{t_i^n}-f_{t_{i-1}^n}$ is bounded by some constant $M$, and then, using the fact that the partition is small, this difference is neglected. The same happens with the product of $\int_{t_{i-1}}^{t_i}\mu_1\, dt$ and each of the stochastic parts: $\int_{t_{i-1}}^{t_i}\mu_1\, dt \cdot \int_{t_{i-1}}^{t_i} \sigma_1\, dW_{t1} \to 0$ and $\int_{t_{i-1}}^{t_i}\mu_1\, dt \cdot \int_{t_{i-1}}^{t_i} \sigma_2\, dW_{t2} \to 0$.
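The drift-times-drift terms can be sketched numerically (illustrative drift values, chosen by me): on a uniform partition each drift increment is $\mu\,\Delta t$, so the sum of products over all $n$ subintervals is $\mu_1\mu_2 T \,\Delta t$, which vanishes as the mesh $\Delta t \to 0$.

```python
import numpy as np

# Hypothetical drift values and horizon, purely for illustration.
mu1, mu2, T = 0.5, 0.3, 1.0

for n in [10, 100, 1000, 10000]:
    dt = T / n
    # Each drift increment over a subinterval is mu * dt, so summing the
    # products over n subintervals gives n * (mu1*dt) * (mu2*dt)
    # = mu1 * mu2 * T * dt, which shrinks linearly with the mesh.
    drift_cross = n * (mu1 * dt) * (mu2 * dt)
    print(n, drift_cross)
```

This is the informal reason the drift-drift term is dropped: it is of order $\Delta t$ per unit time, while the stochastic increments are of order $\sqrt{\Delta t}$, so only the stochastic-times-stochastic term survives the limit.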

So finally we get: $$[R,S] = \lim_{n \to \infty} \sum_{i=1}^{k_n} \int_{t_{i-1}}^{t_i} \sigma_1\, dW_{t1} \int_{t_{i-1}}^{t_i} \sigma_2\, dW_{t2}$$

One more question: the exercise says "Guess $[R,S]$." Does anybody have an idea what "guess the quadratic variation" means?

I can only think of taking the expectation of $[R,S]$; then in the case of different BMs we get $0$, and in the case of the same BM we get just $\lim_{n \to \infty} \sum_{i=1}^{k_n} \int_{t_{i-1}}^{t_i}\sigma_1\sigma_2\, dt$.
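My guess can also be checked by simulation. The sketch below (with parameter values I made up for illustration) computes the sum of products of increments for both cases: driving the two processes with the same Brownian path should give roughly $\sigma_1\sigma_2 T$, while independent paths should give roughly $0$.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (assumed, not from the exercise).
mu1, mu2 = 0.1, -0.2
sigma1, sigma2 = 0.4, 0.7
T, n = 1.0, 100_000
dt = T / n

dW1 = rng.normal(0.0, np.sqrt(dt), n)
dW2 = rng.normal(0.0, np.sqrt(dt), n)  # independent of dW1

def covariation(dX, dY):
    """Sum of products of increments over the partition."""
    return np.sum(dX * dY)

# Case 1: same Brownian motion -> [R, S]_T should be near sigma1*sigma2*T.
dR = mu1 * dt + sigma1 * dW1
dS_same = mu2 * dt + sigma2 * dW1
cov_same = covariation(dR, dS_same)
print(cov_same)  # close to sigma1 * sigma2 * T = 0.28

# Case 2: independent Brownian motions -> [R, S]_T should be near 0.
dS_diff = mu2 * dt + sigma2 * dW2
cov_diff = covariation(dR, dS_diff)
print(cov_diff)  # close to 0
```

Note that no expectation is needed here: the sum itself converges (in probability) to $\sigma_1\sigma_2 T$ in the first case and to $0$ in the second, which is consistent with the guess above.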

Would somebody recommend a rigorous and easy-to-understand book about this? I am using Shreve for now, and I like it, but I think maybe there are other good options. I really want to understand all this material. Thank you.