Understanding a continuous-time stochastic system from a discrete one


Consider a discrete-time stochastic system with an input, for example

$X_{n+1}=F(X_n, U_n)$

where $\{X_n\}$ is a stochastic process, say a Markov chain on $\mathbb R$, $\{U_n\}$ is a sequence of inputs in $\mathbb R$, and $F$ is a nonlinear function. Could anyone give me some references on what a continuous-time version of this system would be? Would it be

$\dot X_t=F(X_t,U_t)$,

where $X_t$ is a continuous-time Markov chain on $\mathbb R$? If so, how should the $\dot X_t = \frac{d X_t}{dt}$ part be understood? Can we even define the derivative of a continuous-time Markov chain? Thanks for any help.
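To make the discrete-time setup concrete, here is a minimal simulation sketch. The function `F` below is a hypothetical nonlinear map chosen only for illustration (it is not from the question), and the randomness enters through the inputs $U_n$, drawn i.i.d. Gaussian, which makes $\{X_n\}$ a Markov chain:

```python
import random

def F(x, u):
    # Hypothetical nonlinear update map F: R x R -> R (illustrative only).
    return 0.5 * x + 0.1 * u * (1.0 - x * x)

def simulate(x0, n_steps, seed=0):
    """Simulate X_{n+1} = F(X_n, U_n) with random inputs U_n ~ N(0, 1)."""
    rng = random.Random(seed)
    x = x0
    traj = [x]
    for _ in range(n_steps):
        u = rng.gauss(0.0, 1.0)  # random input; makes {X_n} a Markov chain
        x = F(x, u)
        traj.append(x)
    return traj

traj = simulate(0.0, 100)
```

With a fixed seed the trajectory is reproducible, which is convenient for comparing against a continuous-time discretization later.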

Accepted answer:

Most continuous-time stochastic processes are not differentiable, and in fact are nowhere differentiable. While we cannot define the derivative of most continuous stochastic processes, we can define an integral with respect to some of them. What the continuous-time version of a system looks like depends on the particulars of the system. For one example, we could have something like $$dX_t = F(X_t,U_t)\,dt + G(X_t,U_t)\,dW_t$$ where $W$ is a Brownian motion. The notation is a little confusing: while this does describe how $X_t$ changes over time, it is shorthand for an integral rather than a derivative. This example means $$X_t = X_0 + \int_0^t F(X_s,U_s)\,ds + \int_0^t G(X_s,U_s)\,dW_s.$$
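The integral form above also suggests how such an SDE is simulated in practice: the standard Euler-Maruyama scheme replaces $dt$ by a small step and $dW_t$ by a Gaussian increment of variance $dt$, which recovers exactly a discrete-time recursion of the form $X_{n+1} = F'(X_n, U_n)$. A minimal sketch, with a hypothetical drift `F` and constant diffusion `G` chosen purely for illustration:

```python
import math
import random

def F(x, u):
    # Hypothetical drift term (illustrative only).
    return -x + u

def G(x, u):
    # Hypothetical diffusion coefficient, here constant (illustrative only).
    return 0.5

def euler_maruyama(x0, u_of_t, T, n_steps, seed=0):
    """Simulate dX_t = F(X_t, U_t) dt + G(X_t, U_t) dW_t on [0, T]."""
    rng = random.Random(seed)
    dt = T / n_steps
    x = x0
    path = [x]
    for k in range(n_steps):
        t = k * dt
        u = u_of_t(t)
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + F(x, u) * dt + G(x, u) * dw
        path.append(x)
    return path

# Example: zero input, unit time horizon.
path = euler_maruyama(1.0, lambda t: 0.0, T=1.0, n_steps=1000)
```

Note that each step of the loop is precisely a discrete-time stochastic system $X_{n+1} = X_n + F(X_n, U_n)\,\Delta t + G(X_n, U_n)\,\Delta W_n$, which is one way to see the correspondence the question asks about.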

For references, you might look at Oksendal's Stochastic Differential Equations, Revuz and Yor's Continuous Martingales and Brownian Motion, or Karatzas and Shreve's Brownian Motion and Stochastic Calculus.