In a solution to a probability exercise, there was the following claim I couldn't understand:
Consider a random vector (signal) $y = x + w$, where $w$ is a noise random vector (signal) with energy $\lVert w\rVert_2 < \epsilon$ for some $\epsilon > 0$, and assume we know the true probability density function (pdf) $f_x(\cdot)$ of the random vector $x$. The claim is that $f_x(x) \ge f_x(y)$.
Can anyone help? I looked into functions of two random variables, but that didn't help, since here we are comparing two values of the same density $f_x(\cdot)$ rather than $f_y(y)$ against $f_x(x)$ and $f_w(w)$.
Given a signal of amplitude $X$, what is the effect on the distribution of the amplitude when noise of amplitude $W$ is added? Let $Y = X + W$ be the observed signal amplitude. Define $f_X(x)$ as the probability density of $X$ and $f_Y(y)$ as the probability density of $Y$. Let $f_X$ attain its maximum at $v$; then $f_X(v) > f_Y(v)$. The point is that the noise spreads the distribution of the received signal, so it is not as sharply peaked as the distribution of the noise-free signal.
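To make the spreading argument concrete, here is a short sketch of why the peak value can only decrease, under the additional assumptions (not stated above) that $X$ and $W$ are independent and that $W$ has a density $f_W$. Since $Y = X + W$, the density of $Y$ is the convolution of the two densities, so at the maximizer $v$ of $f_X$,
$$
f_Y(v) \;=\; \int f_X(v-w)\,f_W(w)\,dw \;\le\; \int f_X(v)\,f_W(w)\,dw \;=\; f_X(v),
$$
because $f_X(v-w) \le f_X(v)$ for every $w$. The inequality is strict whenever $f_X(v-W) < f_X(v)$ with positive probability, for example when the noise is continuous and $f_X$ has a unique mode.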