Linear-time sampling of stochastic processes?


Are there any stochastic processes $(X_t)_{t \in \mathbb{R}^d}$ such that

  1. almost surely paths are continuous but nowhere differentiable and
  2. sampling $n$ points $X_{t_1}, \dots, X_{t_n}$ along a path can be done in $O(n)$ time?

Most sampling techniques I have in mind require at least $O(n \log n)$ time.

On the other hand, all the linear-time samplers I know of produce processes for which 1) fails, e.g. Perlin noise, white noise, pink noise, etc.

Is this even theoretically possible? This might be a rather fundamental question.

2 Answers


Let $(X_i)_{i\in \mathbb{N}_0}$ be a sequence of i.i.d. standard normal random variables. In dimension one define $$B_t^H = X_0 t + \sum_{k=0}^\infty X_{2^k+\lfloor 2^k t\rfloor}\frac{|2^k t-\lfloor 2^k t + \tfrac{1}{2}\rfloor|}{(2^k)^H}, \ t\in [0,1].$$ For $H=\tfrac{1}{2}$ this is standard Brownian motion, and its paths are almost surely continuous but nowhere differentiable.

When $t$ is a dyadic rational ($t = \tfrac{i}{2^m}$ with $i,m \in \mathbb{N}_0$), the terms with $k \ge m$ vanish, since $2^k t$ is then an integer and the tent factor $|2^k t-\lfloor 2^k t + \tfrac{1}{2}\rfloor|$ is zero, so the series has only $m$ summands.

That means sampling at these positions can be done in linear time: refining level by level, each new dyadic point is the average of its two already-computed neighbours plus a single new term. As the dyadic rationals are a dense subset of $[0,1]$, and floating-point numbers are dyadic rationals anyway, this should do the trick.
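The level-by-level refinement can be sketched in code. Below is a minimal NumPy sketch of the one-dimensional series evaluated at all $t = i/2^m$ (the function name and interface are my own; $H=\tfrac12$ recovers Brownian motion): at a level-$k$ midpoint the level-$k$ tent equals $\tfrac12$, all coarser terms are linear across it, so each new point costs $O(1)$ and the total work for $n = 2^m$ points is $1 + 2 + \dots + 2^{m-1} = O(n)$.

```python
import numpy as np

def midpoint_displacement(m, H=0.5, rng=None):
    """Sample B_t^H at t = i/2^m, i = 0..2^m, by midpoint refinement.

    At a level-k midpoint the tent factor is 1/2, scaled by 2^{-kH};
    all coarser terms (and the X_0 * t term) are linear across that
    midpoint, so each new value is the average of its two neighbours
    plus an independent Gaussian bump. Total cost is O(2^m).
    """
    rng = np.random.default_rng() if rng is None else rng
    n = 2**m
    B = np.zeros(n + 1)
    B[n] = rng.standard_normal()          # B_1 = X_0: all tents vanish at t = 1
    for k in range(m):                    # refine one dyadic level at a time
        step = n // 2**k                  # spacing of already-known points
        half = step // 2
        mids = np.arange(half, n, step)   # indices of the new midpoints
        bump = rng.standard_normal(mids.size) * 0.5 / 2**(k * H)
        B[mids] = 0.5 * (B[mids - half] + B[mids + half]) + bump
    return B
```

As a sanity check, for $H=\tfrac12$ the construction gives $\operatorname{Var}(B_1)=1$ and $\operatorname{Var}(B_{1/2})=\tfrac14+\tfrac14=\tfrac12$, matching standard Brownian motion.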

In dimension two take $(X_{i,j})_{i,j\in \mathbb{N}_0}$ and define $$B_{s,t}^H = X_{0,0} \tfrac{s+t}{2} + \sum_{k=0}^\infty \frac{(\tilde X +\dot X) a(2^k s,2^k t)+(\tilde X -\dot X)b(2^k s,2^k t)}{(2^k)^H}$$ for $ \ (s,t)\in [0,1]^2$, where $$\tilde X = X_{2^k+\lfloor 2^k s\rfloor,2^k+\lfloor 2^k t\rfloor},\\ \dot X = X_{2^k+(\lfloor 2^k s\rfloor+ c(2^k s,2^k t)) \text{mod}\, 2^k,2^k+ (\lfloor 2^k t\rfloor+ c(2^k t,2^k s)) \text{mod}\, 2^k},\\ c(x,y)=\tfrac{\text{sign}(\{x\}+\{y\}-1)+\text{sign}(\{x\}-\{y\})}{2},\\ a(x,y)=1 -||\{x\}+\{y\}-1|-|\{x\}-\{y\}||,\\ b(x,y)=1- |\{x\}+\{y\}-1|-|\{x\}-\{y\}| $$ and $\{\cdot\}$ denotes the fractional part.


I believe the Ornstein-Uhlenbeck process has these properties. This is based on another answer on this site about simulating the OU process, which I read a while back but unfortunately cannot find at the moment.

If $dX_t = -\theta X_t dt + \sigma dW_t$, started in the stationary distribution $X_0 \sim N(0,\frac{\sigma^2}{2\theta})$, then $X$ is a stationary Gaussian process with covariance function $\Gamma(s,t)=\frac{\sigma^2}{2\theta}e^{-\theta|t-s|}$. (With $X_0=0$ the covariance would carry an extra transient term $-\frac{\sigma^2}{2\theta}e^{-\theta(s+t)}$, so we start in stationarity.) To sample $X_{t_{i+1}}$ from $X_{t_i}$, we use the fact that $X_t \sim N(0,\frac{\sigma^2}{2\theta})$ for all $t$. Let $(Z_i)$ be a sequence of i.i.d. $N(0,\frac{\sigma^2}{2\theta})$ random variables, define the correlation $\rho_i := e^{-\theta(t_{i+1}-t_i)}$, and set $X_{t_{i+1}} := \rho_i X_{t_i} + \sqrt{1-\rho_i^2}Z_i$. One can check that $(X_{t_i},X_{t_{i+1}})$ has the correct joint distribution, and because the exponential covariance factors as $e^{-\theta(u-s)}=e^{-\theta(t-s)}e^{-\theta(u-t)}$ for $s\le t\le u$, the whole vector $(X_{t_1},X_{t_2},...,X_{t_n})$ also has the correct joint distribution. Hence this gives a way to sample $n$ points from a path of $X$. Since the $X_{t_i}$ are generated through a simple linear recurrence, sampling all $n$ points takes only $O(n)$ time.
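The recurrence can be sketched as follows; a minimal NumPy sketch (function name and interface are mine). Note the exact recurrence requires the stationary start $X_{t_1} \sim N(0, \sigma^2/2\theta)$ and the correlation $\rho_i = e^{-\theta(t_{i+1}-t_i)}$:

```python
import numpy as np

def sample_ou(ts, theta=1.0, sigma=1.0, rng=None):
    """Exact O(n) sampling of a stationary OU path at times ts[0] < ts[1] < ...

    Uses X_{t_{i+1}} = rho_i X_{t_i} + sqrt(1 - rho_i^2) Z_i with
    rho_i = exp(-theta (t_{i+1} - t_i)) and Z_i ~ N(0, sigma^2 / (2 theta)),
    starting from the stationary law X_{t_0} ~ N(0, sigma^2 / (2 theta)).
    """
    rng = np.random.default_rng() if rng is None else rng
    ts = np.asarray(ts, dtype=float)
    sd = sigma / np.sqrt(2 * theta)       # stationary standard deviation
    x = np.empty(len(ts))
    x[0] = rng.normal(0.0, sd)            # stationary initial condition
    for i in range(len(ts) - 1):
        rho = np.exp(-theta * (ts[i + 1] - ts[i]))
        x[i + 1] = rho * x[i] + np.sqrt(1 - rho**2) * rng.normal(0.0, sd)
    return x
```

Because the recurrence is exact (not an Euler discretization), the sampled vector has exactly the Gaussian law with covariance $\Gamma$, regardless of how the times $t_i$ are spaced.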