Good day everyone, I am trying to solve a particular problem relating to probabilities:
I have a time series dataset in which the price at time $t_i$ is $x(t_i)$.
We can then define a log-returns series $r_i = \ln[x(t_{i+1})/x(t_i)]$.
I now want to find some timescale $t_M$ on which the returns could be viewed as a Markov process. For this, we require the Chapman-Kolmogorov (CK) equation $$p(r_2, t_2|r_1, t_1) = \int dr'\, p(r_2, t_2|r', t')\, p(r', t'|r_1, t_1) $$ to hold, where $p$ denotes the usual conditional probability, given by
$$p(r_i, t_i|r_j, t_j) = \frac{p(r_i, t_i;r_j, t_j) }{p(r_j, t_j) } $$
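For concreteness, here is a minimal sketch of how I imagine the conditional probability above could be estimated from data, using a 2D histogram of lagged return pairs (the bin grid, the toy data, and the use of NumPy are my own assumptions, not taken from any of the papers):

```python
import numpy as np

def conditional_density(r, lag, bins):
    """Estimate p(r_{i+lag} | r_i) on a fixed grid from histogram counts.

    Row i of the result is the density of r_{t+lag} conditional on r_t
    falling in bin i: joint counts divided by the marginal (row) counts,
    then divided by the bin width so each occupied row integrates to ~1.
    """
    # joint histogram of pairs (r_t, r_{t+lag})
    H, _, _ = np.histogram2d(r[:-lag], r[lag:], bins=[bins, bins])
    row_sums = H.sum(axis=1, keepdims=True)
    P = np.divide(H, row_sums, out=np.zeros_like(H), where=row_sums > 0)
    return P / np.diff(bins)               # counts -> density per unit r

rng = np.random.default_rng(0)
r = rng.standard_normal(100_000)           # hypothetical toy return series
bins = np.linspace(-4.0, 4.0, 41)          # 40 bins over the return range
P = conditional_density(r, lag=5, bins=bins)
```

Rows with no data are left as zeros, so only occupied conditioning bins contribute.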
We can recast the CK equation as the deviation $$S = \left| p(r_2, t_2|r_1, t_1) - \int dr'\, p(r_2, t_2|r', t')\, p(r', t'|r_1, t_1) \right|$$
evaluated, for given $r_1$ and $r_2$, as a function of the time separation $t' - t_1$, for example. We then expect $t_M$ to be the value of $t' - t_1$ at which $S$ vanishes or reaches its minimum.
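Putting this together, I imagine the integral over $r'$ becoming a sum over bins, so that with $t_2 - t' = t' - t_1 = \tau$ the composed lag-$\tau$ kernel can be compared against the direct lag-$2\tau$ kernel. A self-contained sketch of what I have in mind (the histogram estimator and the i.i.d. test data are my assumptions; i.i.d. returns are trivially Markov, so $S$ should stay small at every lag):

```python
import numpy as np

def conditional_density(r, lag, bins):
    """p(r_{i+lag} | r_i) on a grid: joint histogram / marginal, per unit r."""
    H, _, _ = np.histogram2d(r[:-lag], r[lag:], bins=[bins, bins])
    rows = H.sum(axis=1, keepdims=True)
    return np.divide(H, rows, out=np.zeros_like(H), where=rows > 0) / np.diff(bins)

def ck_deviation(r, lag, bins):
    """Mean of S over the (r1, r2) grid, with t2 - t' = t' - t1 = lag."""
    dr = bins[1] - bins[0]
    P1 = conditional_density(r, lag, bins)      # p(r',t'|r1,t1) and p(r2,t2|r',t')
    P2 = conditional_density(r, 2 * lag, bins)  # direct p(r2,t2|r1,t1)
    # CK integral over r' -> matrix product over the r' bins, times dr
    S = np.abs(P2 - P1 @ P1 * dr)
    return S.mean()

rng = np.random.default_rng(1)
r = rng.standard_normal(200_000)                # toy i.i.d. return series
bins = np.linspace(-4.0, 4.0, 41)
for lag in (1, 5, 20):
    print(lag, ck_deviation(r, lag, bins))      # candidate t_M: argmin over lag
```

For real data one would scan `lag` over a range and look for where `ck_deviation` bottoms out; averaging $S$ over the $(r_1, r_2)$ grid is just one choice of summary, a max or a weighted norm would work as well.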
From what I have read, this value $S$, the CK condition, and the underlying conditional probabilities can all be computed numerically. So much so that all the papers I have read make it seem like a trivial matter. But I cannot find a way to do it.
Does anyone know how to compute such values numerically from a given dataset?
Thank you for your help.