Gaussian with mean equal to another random variable: is it that random variable plus noise? Proof?


Consider a Gaussian whose mean is the realization of another random variable: $p(x|y) = N(x; y, 1)$. I believe this is equivalent to saying $X = Y + Z$, where $Z \sim N(0, 1)$ is independent of $Y$.

Intuitively, this makes sense. If we first sample from Y and then choose a point near that by centering a Gaussian on it, it seems sensible that this is equivalent to choosing Y and then adding some noise.

In Python, I tried simulating this by sampling many points from a triangular distribution ($y$) and then either adding standard normal noise to each ($x_2$) or using each as the mean of another normal distribution and sampling from that ($x$). As you can see, the histograms indicate that they are indeed equivalent.
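The simulation described above can be sketched as follows. The original code is not shown, so the triangular parameters (support $[0, 2]$, mode $1$) and sample size are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Sample y from a triangular distribution on [0, 2] with mode 1 (assumed parameters).
y = rng.triangular(0, 1, 2, size=n)

# Additive-noise sampling: x2 = y + standard normal noise.
x2 = y + rng.standard_normal(n)

# Hierarchical sampling: draw x ~ N(y, 1), i.e. use each y as a mean.
x = rng.normal(loc=y, scale=1.0)

# The two samples should match in distribution (up to Monte Carlo error):
# both means are near 1 (the triangular mean), both variances near 1/6 + 1.
print(np.mean(x), np.mean(x2))
print(np.var(x), np.var(x2))
```

Overlaying histograms of `x` and `x2` (e.g. with `matplotlib`) reproduces the plot in the question.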

[Figure: overlaid histograms of $x$ and $x_2$]

However, I am having trouble seeing how to prove this statement. Does anybody have guidance or a resource that proves this?

Thank you!


There are 2 best solutions below

BEST ANSWER

\begin{align} p_X(x)&=\int_{-\infty}^\infty p_Y(y)p_{X|Y}(x|y)dy=\int_{-\infty}^\infty p_Y(y)\frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}(x-y)^2}dy \\&=(p_Y*\mathcal{N})(x) \end{align}

where $*$ denotes convolution and $\mathcal{N}(x)=\frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}x^2}$ is the standard normal density. Now recall that for *independent* random variables $Z$ and $W$, the density of $U = Z + W$ is $p_U = p_Z * p_W$. Since $p_X$ is exactly the convolution of $p_Y$ with the standard normal density, we conclude $X = Y + Z$ in distribution, where $Z \sim N(0,1)$ is independent of $Y$.


In general for the normal distribution, if $Z\sim N(0,\sigma^2)$, then $X = Z+\mu \sim N(\mu, \sigma^2)$.

To see this, note the characteristic function of $Z$: $E[e^{i t Z}] = e^{- \frac{1}{2}\sigma^2t^2}$. So $E[e^{i t X}] = E[e^{i t (\mu + Z)}] = e^{i t \mu - \frac{1}{2}\sigma^2t^2}$, which is the characteristic function of $N(\mu, \sigma^2)$.
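A quick Monte Carlo sanity check of this characteristic-function identity (the values of $\mu$, $\sigma$, and the test points $t$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 3.0, 2.0, 1_000_000

# X = mu + Z, where Z ~ N(0, sigma^2).
x = mu + sigma * rng.standard_normal(n)

# Compare the empirical characteristic function E[exp(itX)]
# against the closed form exp(it*mu - sigma^2 t^2 / 2).
diffs = []
for t in [0.25, 0.5, 1.0]:
    emp = np.exp(1j * t * x).mean()
    exact = np.exp(1j * t * mu - 0.5 * sigma**2 * t**2)
    diffs.append(abs(emp - exact))
    print(f"t={t}: |empirical - exact| = {abs(emp - exact):.2e}")
```

Each summand $e^{itX}$ has modulus 1, so the empirical average converges at the usual $O(1/\sqrt{n})$ Monte Carlo rate.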

With the same approach you can also prove that if $Y\sim N(0,1)$, then $\mu + \sigma Y \sim N(\mu, \sigma^2)$. (Exercise.)