Dudley's integral convergence for a stationary Gaussian process


I am currently taking a course on Gaussian processes at my university and have run into a problem I find difficult. Suppose we have a stationary Gaussian process with zero mean, unit variance, and a correlation function that is monotone in a right neighbourhood of zero and satisfies $r(t) = 1 - |t|^\delta\,(1 + o(1))$ as $t \to 0$, where $\delta > 0$.

The question is: does Dudley's entropy integral for this process converge? I have tried to construct a Delone set (a separated net) for $(-\infty, \infty)$ in the metric induced by this correlation function, and my conjecture is that the integral diverges, but I am really not sure whether that is correct.
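For reference, here is a short sketch of the computation behind "the metric induced by this correlation function", assuming the standard canonical metric $d(s,t) = \big(\mathbb{E}(X_s - X_t)^2\big)^{1/2}$ for a centered process:

$$
d(s,t)^2 = \mathbb{E}(X_s - X_t)^2 = 2 - 2\,r(s-t) = 2\,|s-t|^{\delta}\,(1 + o(1)), \qquad s - t \to 0,
$$

so near the diagonal $d(s,t) \approx \sqrt{2}\,|s-t|^{\delta/2}$. This is the metric in which I am trying to estimate the covering numbers $N(\varepsilon)$ entering Dudley's integral $\int_0^{\infty} \sqrt{\log N(\varepsilon)}\,d\varepsilon$; my difficulty is with the behaviour over the whole line $(-\infty, \infty)$ rather than on a compact interval.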