I am trying to show that a set of numbers {g1, g2, ..., gn} drawn from a Gaussian distribution with mean zero and standard deviation 1 can be transformed into a set of Gaussian numbers {G1, G2, ..., Gn} with mean m and standard deviation s by the formula:
Gi = s * gi + m (ignoring any normalization constants)
Despite my best efforts, I cannot seem to find a way to prove that this well-known relationship holds.
Since $g_i$ has cdf $\Phi(x)$, the cdf of $G_i$ (assuming $s>0$) is $P(G_i\le x)=P\left(g_i\le \frac{x-m}{s}\right)=\Phi\left(\frac{x-m}{s}\right)$. Differentiating gives the pdf of $G_i$, viz. $\frac{1}{s}\phi\left(\frac{x-m}{s}\right)=\frac{1}{s\sqrt{2\pi}}\exp\left(-\frac{1}{2}\left(\frac{x-m}{s}\right)^2\right)$, as expected.
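As a numerical sanity check of the derivation, one can draw standard-normal samples, apply the transformation, and compare the sample mean and standard deviation against m and s. A minimal sketch using NumPy (the values m = 2.5 and s = 3.0 are arbitrary illustrative choices, not from the question):

```python
import numpy as np

# Draw g_i ~ N(0, 1), apply G_i = s * g_i + m, and confirm the sample
# moments of G are close to m and s.
rng = np.random.default_rng(0)
g = rng.standard_normal(1_000_000)   # g_i ~ N(0, 1)

m, s = 2.5, 3.0                      # arbitrary target mean and std
G = s * g + m                        # the claimed transformation

print(G.mean())                      # close to m = 2.5
print(G.std())                       # close to s = 3.0
```

With a million samples, the standard error of the mean is about s/1000, so the printed values should agree with m and s to roughly two decimal places.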