What is the average of the product of two stochastic processes?


Consider two random processes $X(t)$ and $Y(t)$ for which $$\langle X(t) X(t') \rangle = \mu_X^2 + \sigma_X^2 \delta(t-t')$$ $$\langle Y(t) Y(t') \rangle = \mu_Y^2 + \sigma_Y^2 \delta(t-t'),$$ i.e. the processes are delta-correlated in time but have a non-zero mean. $X$ and $Y$ are independent processes.

My question is: what is $$\langle X(t) X(t') Y(t) Y(t') \rangle$$ ?

Since they are independent, I should be able to write $$\langle X(t) X(t') Y(t) Y(t') \rangle = \langle X(t) X(t')\rangle \langle Y(t) Y(t') \rangle \\ = \mu_X^2\mu_Y^2 + \mu_X^2\ \sigma_Y^2\delta(t-t') + \mu_Y^2\ \sigma_X^2\delta(t-t') + \sigma_X^2 \sigma_Y^2 (\delta(t-t'))^2, $$ but I don't know how to make sense of the squared delta function. How do I use that in an integral such as $$\int_0^T \int_0^T (\cdot) \,dt\,dt'$$ ?

Thank you in advance. Any help will be much appreciated.

If it makes a difference, you can assume both processes are Gaussian.

1 Answer
It might help to consider the correlations as being piecewise defined: $$ \langle X(t) X(t') \rangle = \mu_X^2 ~~ \textrm{if} ~~ t \neq t', \\ \langle X(t) X(t') \rangle = \mu_X^2 + \sigma_X^2 ~~ \textrm{if} ~~ t = t',$$ and similarly for $Y$. In this way you can calculate $$ \langle X(t) X(t')\rangle \langle Y(t)Y(t') \rangle = \mu_X^2 \mu_Y^2 ~~ \textrm{if} ~~ t \neq t', ~~\textrm{but} \\ \langle X(t) X(t') \rangle\langle Y(t)Y(t') \rangle = (\mu_X^2 + \sigma_X^2)(\mu_Y^2 + \sigma_Y^2) ~~ \textrm{if} ~~ t = t'.$$ If you then reintroduce your delta function to distinguish the two you find that you have the same solution as you suggested, just with the $\delta^2$ replaced by a $\delta$.

For this reason, in this context, you can assert $\delta^2 = \delta$.
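Under that convention ($\delta^2 = \delta$), the double integral asked about in the question can then be evaluated term by term, using $\int_0^T \int_0^T \delta(t-t')\,dt\,dt' = T$ (a sketch; the divergent-at-coincidence nature of white noise is being handled heuristically here, as in the argument above):

$$\int_0^T \!\! \int_0^T \langle X(t) X(t') Y(t) Y(t') \rangle \,dt\,dt' = \int_0^T \!\! \int_0^T \left[ \mu_X^2\mu_Y^2 + \left( \mu_X^2\sigma_Y^2 + \mu_Y^2\sigma_X^2 + \sigma_X^2\sigma_Y^2 \right) \delta(t-t') \right] dt\,dt' \\ = \mu_X^2\mu_Y^2\, T^2 + \left( \mu_X^2\sigma_Y^2 + \mu_Y^2\sigma_X^2 + \sigma_X^2\sigma_Y^2 \right) T.$$

Note how the mean-squared term scales as $T^2$ while every term carrying a delta function contributes only linearly in $T$.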