How to solve this problem related to the normal distribution?


Suppose $X$ and $Y$ are two independent and identically distributed Gaussian random variables. How do I calculate the probability that these two random variables are not equal throughout a time interval of length $T$?

Here, at any time point $t$, the random variables $X$ and $Y$ are Gaussian distributed.

Is this related to stochastic processes?

Best answer:

OK, I'm going to take a stab at answering this not-so-clearly formulated question. I may well mess it up, and someone who knows more about probability than I do may mock me, but instead of doing that, they can write their own answer.

I'm going to simplify by assuming the mean is $0$ and the variance is $1$, so that the pdf is $$ s(x) = \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right). $$
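As a quick numerical sanity check (a sketch, not part of the argument), we can verify that this pdf integrates to $1$ and has variance $1$ by trapezoidal integration over a wide interval:

```python
import numpy as np

# Standard normal pdf s(x) = exp(-x^2/2) / sqrt(2*pi), evaluated on a grid
# wide enough that the tails beyond it are negligible.
x = np.linspace(-10, 10, 200001)
s = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

total = np.trapz(s, x)        # total probability, should be approximately 1
var = np.trapz(x**2 * s, x)   # variance (mean is 0), should be approximately 1
print(total, var)
```

Both values come out equal to $1$ up to numerical error, confirming the normalization of $s$.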

The domain for the random variable $X$ is $$ \Bbb R \times [0, T], $$ which is also the domain for $Y$. We can therefore consider $$ U = \Bbb R \times \Bbb R \times [0, T] $$ as a probability space, with probability density $$r(x, y, t) = s(x)\, s(y)\, \frac{1}{T}.$$

We can also define $$ h: U \to \Bbb R: (x, y, t) \mapsto \begin{cases} 1 & x = y \\ 0 & x \ne y \end{cases}. $$ We'd like to find the probability of the event $E = \{ (x,y,t) \mid h(x, y, t) = 1 \}$. That's the probability that at any time between $0$ and $T$, the two random variables are equal.

That probability is, by definition, just \begin{align} p &= \int_{0}^T \int_{-\infty}^\infty \int_{-\infty}^\infty h(x, y, t) r(x, y, t) dx ~ dy ~ dt \\ &= \int_{0}^T \int_{-\infty}^\infty \int_{-\infty}^\infty h(x, y, t) s(x) s(y) \frac{1}{T} dx ~ dy ~ dt \\ &= \int_{0}^T \frac{1}{T} \left(\int_{-\infty}^\infty \int_{-\infty}^\infty h(x, y, t) s(x) s(y) dx ~ dy\right) ~ dt \\ &= \int_{0}^T \frac{1}{T}~dt \left(\int_{-\infty}^\infty \int_{-\infty}^\infty h(x, y, 0) s(x) s(y) dx ~ dy\right) \\ &= \int_{-\infty}^\infty \int_{-\infty}^\infty h(x, y, 0) s(x) s(y) dx ~ dy \\ &= \int_{-\infty}^\infty \left(\int_{-\infty}^\infty h(x, y, 0) s(x) dx\right) s(y) ~ dy \\ \end{align} where the substitution of $0$ for $t$ is valid because $h$ is independent of its third argument. Looking at that last integral, let's fix $y$ for the moment, and look at the inner integral. It is $$ I = \int_{-\infty}^\infty h(x, y, 0) s(x) ~ dx. $$ The integrand is $s(x)$ only at the single point where $x = y$; otherwise it's equal to $0$. If two functions agree at all but a single point, their integrals are equal, so we have \begin{align} p &= \int_{0}^T \int_{-\infty}^\infty \int_{-\infty}^\infty h(x, y, t) r(x, y, t) dx ~ dy ~ dt \\ &= \int_{-\infty}^\infty \left(\int_{-\infty}^\infty h(x, y, 0) s(x) dx\right) s(y) ~ dy \\ &= \int_{-\infty}^\infty \left(\int_{-\infty}^\infty 0 dx\right) s(y) ~ dy \\ &= \int_{-\infty}^\infty 0 ~ dy \\ &= 0. \end{align}
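The computation above can also be illustrated by simulation (a sketch, not a proof): if we draw many independent standard-normal pairs $(X, Y)$ and count exact equalities, the empirical fraction is $0$, matching $p = 0$.

```python
import numpy as np

# Monte Carlo illustration: for continuous distributions, exact equality
# X == Y is a probability-zero event, so the observed count should be 0.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
equal_fraction = np.mean(x == y)
print(equal_fraction)
```

Note that this only ever demonstrates the absence of *exact* equality in floating point; it does not by itself prove the measure-zero statement, which is what the integral computation above does.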

In short: the probability (with respect to the probability measure described above) that $X(t) = Y(t)$ at some time $t$ with $0 \le t \le T$ is zero. So the probability that they're different throughout the given time interval is one. (This doesn't mean that they'll always be different; merely that their being different is a probability-one event.)