Can someone explain why the pdf of the noise $w$ is equal to the conditional pdf of the observation $x$ given $A$, assuming the noise is independent of $A$? The model is $x[n] = A + w[n]$, where $A$ is the mean (and a random variable).
This fact is used to determine the posterior pdf via Bayes' rule.
The notation is a bit unusual, but the underlying fact is the following.

If $W$ and $A$ are independent random variables, and $X = h(W, A)$ for some measurable function $h$, then the conditional distribution of $X$ given $A = a$ is the distribution of the random variable $h(W, a)$ (to be rigorous, one should speak of a regular conditional distribution). Here $h(w, a) = w + a$. Now, if $f_W$ is the pdf of $W$, it is easy to see that the random variable $Y := W + a$ has pdf $f_Y(y) = f_W(y - a)$. In the question's notation, this says $p(x \mid A = a) = f_W(x - a)$: the conditional pdf of the observation is just the noise pdf shifted by $a$, which is exactly what is plugged into Bayes' rule.
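As a sanity check, here is a minimal Monte Carlo sketch of the shift identity $p(x \mid A = a) = f_W(x - a)$. The Gaussian noise, the noise standard deviation, and the fixed value $a = 2$ are illustrative assumptions, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 2.0        # hypothetical fixed value of A (assumption for illustration)
sigma = 1.0    # assumed noise standard deviation
n = 100_000

w = rng.normal(0.0, sigma, n)   # samples of the noise W
x = a + w                       # observations X = A + W, conditioned on A = a

# Claim: the pdf of X given A = a is f_W(x - a).
# Compare a normalized histogram of x against the shifted noise pdf.
hist, edges = np.histogram(x, bins=200, range=(a - 5, a + 5), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
f_w_shifted = np.exp(-0.5 * ((centers - a) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
max_err = np.max(np.abs(hist - f_w_shifted))
print(max_err < 0.05)   # histogram tracks f_W(x - a)
```

The empirical density of $X$ matches the noise pdf translated by $a$, as the fact above predicts.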