Imagine we have two signals:
$ x_1[t]=\hat{x}_1[t]+N(0, \Sigma_1)$
$ x_2[t]=\hat{x}_2[t]+N(0, \Sigma_2)$
where $x_i$ and $\hat{x}_i$ are the measured and true values, respectively, and $N(0, \Sigma_i)$ is Gaussian noise. The true values change over time and do not follow any particular distribution; however, we know the signals are continuous. Given $T_k$ observations over time (where $T_k$ can be large), is there any way to calculate the following probability:
$P\left(\|\hat{x}_1[t]-\hat{x}_2[t]\|<\delta \;\middle|\; \mu,\sigma^2\right)$
where $\mu=\frac{1}{T_k}\sum\limits_{i=1}^{T_k}\|{x}_1[t-i]-{x}_2[t-i]\|$ and $\sigma^2=\operatorname{var}\left(\{\|{x}_1[t-i]-{x}_2[t-i]\|\}_{i=1}^{T_k}\right)$ are the sample mean and variance of the norm of the difference over the last $T_k$ observations. Any suggestions?
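For concreteness, here is a minimal NumPy sketch of how I compute the sample mean and variance of the norms (the data here is synthetic, just standing in for the $T_k$ most recent observations). The Gaussian-CDF step at the end is only a naive first-pass assumption I tried, treating the norm itself as normally distributed and ignoring the measurement noise, which is exactly what I suspect is wrong:

```python
import math
import numpy as np

# Synthetic stand-ins for the T_k most recent noisy observations.
rng = np.random.default_rng(0)
T_k, dim = 500, 3
x1 = rng.normal(size=(T_k, dim))            # measured x_1[t-i], i = 1..T_k
x2 = rng.normal(loc=0.5, size=(T_k, dim))   # measured x_2[t-i], i = 1..T_k

# Norm of the difference for each past observation.
d = np.linalg.norm(x1 - x2, axis=1)

# Sample mean and variance of the norms (mu and sigma^2 above).
mu = d.mean()
sigma2 = d.var(ddof=1)

# Naive estimate: treat the norm as N(mu, sigma2) and evaluate
# P(norm < delta) with the standard normal CDF. This describes the
# *noisy* norms, not the true ones, so it is only a rough sketch.
delta = 2.0
p = 0.5 * (1.0 + math.erf((delta - mu) / math.sqrt(2.0 * sigma2)))
```

The part I am unsure about is the last step: how to go from statistics of the noisy norms $\|x_1 - x_2\|$ to a statement about the true norms $\|\hat{x}_1 - \hat{x}_2\|$.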