If $X,Y$ are random variables and $Y\sim\mathcal N(x,\sigma^2)$ given $X=x$, can we conclude that $Y-X\sim\mathcal N(0,\sigma^2)$?


Let $\sigma>0$ and $$Q(x,\;\cdot\;):=\mathcal N(x,\sigma^2)\;\;\;\text{for }x\in\mathbb R.$$ Note that $Q$ is a Markov kernel on $(\mathbb R,\mathcal B(\mathbb R))$.

Now, let $X,Y$ be real-valued random variables on a common probability space $(\Omega,\mathcal A,\operatorname P)$. Let $X_\ast\operatorname P$ and $Y_\ast\operatorname P$ denote the distributions of $X$ and $Y$, respectively, and assume that $$Y_\ast\operatorname P=(X_\ast\operatorname P)Q,$$ where the right-hand side denotes the composition of $X_\ast\operatorname P$ and $Q$.

Are we able to show that $(Y-X)_\ast\operatorname P=\mathcal N(0,\sigma^2)$?

Intuitively, if $X=x\in\mathbb R$, then $Y\sim\mathcal N(x,\sigma^2)$ and it's a well-known fact that $Y-x\sim\mathcal N(0,\sigma^2)$.
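This intuition is easy to sanity-check with a quick Monte Carlo sketch (the exponential law for $X$ and $\sigma=2$ are arbitrary choices; any law for $X$ should work):

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 200_000, 2.0

# Draw X from an arbitrary law (exponential here), then Y | X = x ~ N(x, sigma^2).
x = rng.exponential(scale=1.0, size=n)
y = x + rng.normal(loc=0.0, scale=sigma, size=n)

# If the claim holds, Y - X should look like N(0, sigma^2) regardless of the law of X.
d = y - x
print(d.mean(), d.std())  # should be close to 0 and sigma
```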

Accepted answer:

From the mentioned property of the normal distribution, we know that $$\int 1_B(y-x)\:\mathcal N(x,\sigma^2)({\rm d}y)=\mathcal N(0,\sigma^2)(B)\;\;\;\text{for all }(x,B)\in\mathbb R\times\mathcal B(\mathbb R)\tag1.$$ Note that the marginal identity $Y_\ast\operatorname P=(X_\ast\operatorname P)Q$ alone does not determine the joint law of $(X,Y)$; what is really needed is that $Q$ is a regular version of the conditional distribution of $Y$ given $X$, i.e. $$\operatorname P\left[(X,Y)\in\;\cdot\;\right]=X_\ast\operatorname P\otimes\:Q\tag2$$ (which in turn implies the marginal identity). With $(2)$ in hand, \begin{equation}\begin{split}\operatorname P\left[Y-X\in B\right]&=\int\operatorname P\left[X\in{\rm d}x\right]\int Q(x,{\rm d}y)1_B(y-x)\\&=\int\operatorname P\left[X\in{\rm d}x\right]\int\mathcal N(0,\sigma^2)({\rm d}y)1_B(y)=\mathcal N(0,\sigma^2)(B)\end{split}\tag3\end{equation} for all $B\in\mathcal B(\mathbb R)$.
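Identity $(1)$ can be verified numerically for intervals $B=(a,b]$; a minimal sketch using only the standard library (the interval endpoints, the values of $x$, and $\sigma$ are arbitrary choices):

```python
import math

def norm_cdf(z, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2) expressed via the error function."""
    return 0.5 * (1.0 + math.erf((z - mu) / (sigma * math.sqrt(2.0))))

sigma = 2.0
a, b = -1.0, 0.5  # B = (a, b], arbitrary test interval

for x0 in (-3.0, 0.0, 1.7):
    # LHS of (1): mass that N(x0, sigma^2) assigns to {y : y - x0 in B}.
    lhs = norm_cdf(b + x0, mu=x0, sigma=sigma) - norm_cdf(a + x0, mu=x0, sigma=sigma)
    # RHS of (1): N(0, sigma^2)(B), which does not involve x0 at all.
    rhs = norm_cdf(b, mu=0.0, sigma=sigma) - norm_cdf(a, mu=0.0, sigma=sigma)
    assert abs(lhs - rhs) < 1e-12
```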

Second answer:

Yes: we can prove that $Y-X\sim\mathcal N(0,\sigma^2)$.

Proof: the key is to write the law of $Y-X$ using the tower property, i.e. as the expectation of the conditional probability given $X$:

$$P(Y-X < t) = E( P( Y -X < t | X ) ) $$

Then notice that $P(Y-X<t\mid X)$ (which is a function of $X$) actually does not depend on $X$:

Let $N_a$ denote a Gaussian variable with $N_a\sim\mathcal N(a,\sigma^2)$ (unrelated to $X$ and $Y$). Then $$P(Y-X<t\mid X=x_0)=P(Y-x_0<t\mid X=x_0)=P(N_{x_0}-x_0<t)=P(N_0<t).$$ This last term is constant (i.e. does not depend on $x_0$), therefore its expectation is the same:

$$ E( P( N_0 < t ) ) = P( N_0 < t ) $$

We have just proven that $$P(Y-X < t) = P( N_0 < t )\;\;\;\text{for all }t\in\mathbb R.$$

Therefore, $$ Y - X \sim \mathcal N(0,\sigma^2). $$