Variance of Time-Integrated Ornstein-Uhlenbeck Process


I'm attempting to filter white noise from a deterministic, finite-power signal using a low-pass filter. This filter can be described using an exponentially-decaying response function:

$$ h(t) = \gamma \exp (-\gamma t) $$

Evaluating the Wiener integral for the filtered noise results in a zero-mean Ornstein-Uhlenbeck process, whose variance is:

$$\sigma^2_{OU} =\frac{\gamma}{2}\left( 1-\exp(-2\gamma t)\right)$$
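As a sanity check (my own sketch, not part of the original question), this variance can be verified by Monte Carlo: the filtered noise $X(t)=\int_0^t h(t-s)\,dW(s)$ satisfies the SDE $dX=-\gamma X\,dt+\gamma\,dW$ with $X(0)=0$, which can be stepped with Euler–Maruyama:

```python
import numpy as np

# Monte Carlo check: simulate dX = -gamma X dt + gamma dW from X(0) = 0
# and compare the sample variance at time T with (gamma/2)(1 - exp(-2 gamma T)).
rng = np.random.default_rng(0)
gamma, T, n_steps, n_paths = 2.0, 1.0, 1000, 50_000
dt = T / n_steps

X = np.zeros(n_paths)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    X += -gamma * X * dt + gamma * dW

empirical = X.var()
theoretical = 0.5 * gamma * (1.0 - np.exp(-2.0 * gamma * T))
print(empirical, theoretical)  # the two should agree to within Monte Carlo error
```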

The task now is to find the variance of the time-integrated process. My question is whether the following procedure, based on Wiener integrals, is correct, and if not, what is the correct procedure?


First, we use the fact that Gaussian random variables are determined by their means and variances to express the zero-mean OU process as the product of a deterministic function and a Wiener process:

$$ \sigma^2_W = t \quad \therefore \quad \sigma^2_{F\times W} = F^2(t)\times t $$ $$ F(t) = \sqrt{\frac{\gamma}{2}} \sqrt{\frac{1-\exp(-2\gamma t)}{t}} \quad \rightarrow \quad \sigma^2_{F\times W} = \sigma^2_{OU}$$

We perform integration by parts:

$$ \int_0^t W(\tau)F(\tau)d\tau = \left. \left[W(\tau)\int_{0}^{\tau}F(\tau ')d\tau ' \right] \right \vert_0^t - \int_0^t \left( \int_{0}^{\tau}F(\tau ')d\tau ' \right)dW(\tau)$$

Labelling $\int_{0}^{\tau}F(\tau ')d\tau '$ as $\tilde{F}(\tau)$, we write the variance of the integrated process as the sum of the variances of the two terms above. The variance of the left term is $\tilde{F}^2(t)\times t$, since this is just a Wiener process multiplied by a deterministic function. The term on the right is a Wiener integral, whose variance is $\int_0^t \tilde{F}^2(\tau)\, d\tau$. I'll leave $\tilde{F}(\tau)$ unspecified here, since the integral is difficult.


So, have I made some error in deriving the variance?


There are 2 answers below.

Best answer:

An Ornstein-Uhlenbeck process is not the product of a Wiener process with a deterministic function. A Gaussian process $X(t)$ is not determined by its mean and variance for each $t$, but rather by its mean for each $t$ and by the covariance of $X(s)$ and $X(t)$ for each $s,t$.
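To make this concrete, here is a small numerical illustration (my own sketch, not from the post): with $F(t)$ chosen so that $F^2(t)\,t = \sigma^2_{OU}(t)$, the process $F(t)W(t)$ matches the OU process in marginal variance at every $t$, yet the two covariance functions differ, so the processes (and hence their time integrals) are not the same.

```python
import numpy as np

gamma = 2.0

def var_ou(t):
    # marginal variance of the zero-start OU process from the question
    return 0.5 * gamma * (1.0 - np.exp(-2.0 * gamma * t))

def F(t):
    # chosen so that Var(F(t) W(t)) = F(t)^2 * t = var_ou(t)
    return np.sqrt(var_ou(t) / t)

def cov_ou(s, t):
    # Cov(X(s), X(t)) for the zero-start OU process, assuming s <= t
    return 0.5 * gamma * (np.exp(-gamma * (t - s)) - np.exp(-gamma * (t + s)))

def cov_fw(s, t):
    # Cov(F(s) W(s), F(t) W(t)) = F(s) F(t) min(s, t), assuming s <= t
    return F(s) * F(t) * s

s, t = 0.5, 1.0
print(var_ou(t), F(t) ** 2 * t)    # equal: the marginal variances agree
print(cov_ou(s, t), cov_fw(s, t))  # different: the covariances do not
```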

Second answer:

(In what follows, I'll use autocorrelations instead of autocovariances, since the processes being discussed are zero-mean.)

When a random process is subjected to a linear, time-invariant system, the autocorrelation of the resulting process can be found using a convolution formula:

$$ R_{\textrm{out}}(t) = R_{\textrm{in}}(t) \ast r_{\textrm{sys}}(t) $$

Here, $r_{\textrm{sys}}(t) = h_{\textrm{sys}}(t) \ast h_{\textrm{sys}}(-t)$, where $h_{\textrm{sys}}(t)$ is the impulse response of the system (its transfer function expressed in the time domain).

Integration can be represented as such a system, since the integral over $[0,t]$ is a convolution with a unit box function (evaluated at time $t$):

$$\int_0^t X(\tau)d\tau = X(\tau) \ast B(\tau/t-1/2)$$

Here, the box function $B(x)$ is defined to be $0$ wherever $\vert x \vert > 1/2$, and $1$ elsewhere.
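A quick numerical check (my own sketch): the deterministic autocorrelation of this box, $\int_{-\infty}^{\infty}B(t'/t-1/2)\,B((t'+\tau)/t-1/2)\,dt'$, is the triangle $\max(0,\, t-\vert\tau\vert)$, which can be confirmed by discretely correlating two sampled boxes:

```python
import numpy as np

t, n = 1.0, 1000
dt_ = t / n
box = np.ones(n)                                   # B(t'/t - 1/2) sampled on [0, t]
r_sys = np.correlate(box, box, mode="full") * dt_  # lags from -t to t
lags = (np.arange(r_sys.size) - (n - 1)) * dt_
triangle = np.maximum(0.0, t - np.abs(lags))
print(np.max(np.abs(r_sys - triangle)))            # discretization error only
```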

The variance of a process $X$ is $\mathbb{E}(X^2(t)) = R_X(0)$, where $R_X(\tau)=\mathbb{E}(X(t)X(t+\tau))$.

The Ornstein-Uhlenbeck process above is exponentially correlated, with stationary autocorrelation $R_{OU}(\tau) = \frac{\gamma}{2}\exp(-\gamma \vert \tau \vert)$ (the $\gamma/2$ prefactor matches the stationary variance found in the question). Convolving this with $r_{\textrm{sys}}(\tau) = \int_{-\infty}^{\infty}B(t'/t-1/2)\,B((t'+\tau)/t-1/2)\,dt'$ and evaluating at zero lag gives a variance of

$$ t + \dfrac{\exp(- \gamma t)-1}{ \gamma} $$
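This can be checked by quadrature (my own sketch): the variance of $\int_0^t X(s)\,ds$ for a stationary process with autocorrelation $R$ is $\int_{-t}^{t} R(\tau)\,\max(0,\, t-\vert\tau\vert)\,d\tau$, where the triangle is the box's autocorrelation. With $R(\tau) = \frac{\gamma}{2}\exp(-\gamma\vert\tau\vert)$, this should reproduce the closed form above.

```python
import numpy as np

gamma, t = 2.0, 1.0

# trapezoidal quadrature of R(tau) * max(0, t - |tau|) over [-t, t]
tau = np.linspace(-t, t, 200_001)
y = 0.5 * gamma * np.exp(-gamma * np.abs(tau)) * (t - np.abs(tau))
numeric = np.sum((y[:-1] + y[1:]) / 2) * (tau[1] - tau[0])

closed_form = t + (np.exp(-gamma * t) - 1.0) / gamma
print(numeric, closed_form)  # should agree to quadrature accuracy
```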

However, I don't know whether this is correct. Comments?