What is the logic behind writing this specific Normal random variable, which is conditional, as the sum of 2 other Normals?


Problem: Suppose $Y \sim \mathrm{Normal}(\mu = 1, \sigma^2 = 1)$ and, conditional on $Y = y$, $X \sim \mathrm{Normal}(\mu = y, \sigma^2 = 4)$. We want to estimate $\theta = P(X > 1)$.

Raw estimator

$\hat{\theta} = n^{-1}\sum_{i=1}^{n} W_i$, where $W_i = I(X_i > 1)$. Note that we can write $X = Y + Z$, where $Y \sim \mathrm{Normal}(\mu = 1, \sigma^2 = 1)$ and $Z \sim \mathrm{Normal}(\mu = 0, \sigma^2 = 4)$ is independent of $Y$.
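The raw estimator can be simulated directly by sampling the hierarchy: draw $Y$, then draw $X$ given that $Y$. A minimal sketch in Python (NumPy, with an arbitrary seed and sample size chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Sample Y ~ Normal(mu=1, sigma^2=1), then X | Y=y ~ Normal(mu=y, sigma^2=4),
# i.e. standard deviation 2.
y = rng.normal(loc=1.0, scale=1.0, size=n)
x = rng.normal(loc=y, scale=2.0)

# Raw Monte Carlo estimator: the mean of the indicators W_i = I(X_i > 1).
theta_hat = (x > 1).mean()
print(theta_hat)
```

Since $X \sim \mathrm{Normal}(1, 5)$ unconditionally (see the answer below for why), the true value is $\theta = P(X > 1) = 0.5$, and the printed estimate should be close to that.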

My main question

What is the logic behind writing X = Y + Z ?

I understand that the sum of two independent Normal random variables is also Normal, with mean equal to the sum of the means and variance equal to the sum of the variances. But in this case I don't understand how $Z$ was determined, i.e. why $Z \sim \mathrm{Normal}(\mu = 0, \sigma^2 = 4)$.

Also, I see that the variance of $X$ is $4$ when $Y$ is fixed. But I am told that when $Y$ is random, the variance of $Y$ gets added to the conditional variance of $X$. Any references to theorems or general rules (with proofs) behind this statement would also be helpful.
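For reference, the general rule being invoked here is the law of total variance. Applied to this setup, where $\operatorname{Var}(X \mid Y) = 4$ is constant and $E[X \mid Y] = Y$:

$$\operatorname{Var}(X) = E\big[\operatorname{Var}(X \mid Y)\big] + \operatorname{Var}\big(E[X \mid Y]\big) = E[4] + \operatorname{Var}(Y) = 4 + 1 = 5.$$

This is exactly the "variance of $Y$ gets added" statement: the first term is the average within-$Y$ spread of $X$, and the second is the extra spread coming from $Y$ itself being random.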

Best answer

Requested from comments

If $A \sim N(μ_A,σ^2_A)$ and $B\sim N(μ_B,σ^2_B)$ are independent

then, for a constant $a$, $C = a + B \sim N(a + μ_B, σ^2_B)$

and $D=A+B\sim N(μ_A+μ_B,σ^2_A+σ^2_B)$.

This is essentially the same thing, with $μ_B=0$ and $Y=A, X=D, Z=B$.

The variance of $Z$ is $4$ (as is the conditional variance of $X$ given $Y$)

but the unconditional variance of $X$ is $1+4=5$.
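The equivalence of the two constructions can be checked empirically: sampling $X$ hierarchically ($Y$ first, then $X \mid Y$) and sampling $X = Y + Z$ with independent $Z$ should give the same distribution, with variance near $5$ and $P(X > 1)$ near $0.5$. A small sketch (NumPy, illustrative seed and sample size):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Construction 1: hierarchical sampling, Y ~ N(1, 1) then X | Y ~ N(Y, 4).
y = rng.normal(1.0, 1.0, size=n)
x_hier = rng.normal(loc=y, scale=2.0)

# Construction 2: X = Y + Z with Z ~ N(0, 4) independent of Y.
z = rng.normal(0.0, 2.0, size=n)
x_sum = y + z

print(x_hier.var(), x_sum.var())  # both close to 1 + 4 = 5
print((x_hier > 1).mean())        # close to 0.5, since X ~ N(1, 5)
```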