Beginning to learn a bit of probability theory and I'm a little confused with this one. I know that given a standard normal variable $X \sim \mathcal{N}(0, 1)$, if $Y = X + c$ then $Y \sim \mathcal{N}(c, 1)$.
Now if,
$Y = aX+b$, is $Y \sim \mathcal{N}(b/a, \sigma^2)$?
If you don't remember the exact distribution for $Y=a+bX$, you can always derive it. First, suppose $b>0$. Then,
$$\begin{aligned} \Pr[Y\leq y]&=\Pr[a+bX\leq y]\\ &=\Pr[X\leq (y-a)/b]\\ &=\int_{-\infty}^{(y-a)/b}\frac{1}{\sqrt{2\pi}}\exp\left[-\frac{1}{2}x^2\right]dx. \end{aligned}$$
You want the upper limit of integration to be $y$, so make the substitution $u=a+bx$, i.e. $x=(u-a)/b$ and $dx=du/b$. As $x$ varies between $-\infty$ and $(y-a)/b$, $u$ varies between $-\infty$ and $y$ (this is where we use $b>0$). We get
$$ \Pr[Y\leq y]=\int_{-\infty}^y\frac{1}{b\sqrt{2\pi}}\exp\left[-\frac{1}{2}\frac{(u-a)^2}{b^2}\right]du. $$
You can then recognize the integrand as the PDF of $N(a,b^2)$.
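If you like, you can sanity-check the change of variables numerically: for the (arbitrarily chosen) values $a=2$, $b=3$, $y=4$, the probability $\Pr[X\leq(y-a)/b]$ computed from the standard normal CDF should equal the integral of the $N(a,b^2)$ density up to $y$. A minimal sketch, using only the standard library:

```python
import math

def phi(z):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

a, b, y = 2.0, 3.0, 4.0  # hypothetical values with b > 0

# Left-hand side: Pr[a + bX <= y] = Pr[X <= (y - a)/b]
lhs = phi((y - a) / b)

# Right-hand side: integrate the N(a, b^2) density up to y
# (trapezoid rule; the lower limit a - 10b stands in for -infinity)
def pdf(u):
    return math.exp(-0.5 * ((u - a) / b) ** 2) / (b * math.sqrt(2.0 * math.pi))

n, lo = 20_000, a - 10.0 * b
h = (y - lo) / n
rhs = sum(h * 0.5 * (pdf(lo + i * h) + pdf(lo + (i + 1) * h)) for i in range(n))

print(lhs, rhs)  # the two probabilities agree to several decimal places
```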
If $b<0$, then $Y=a+bX=a+(-b)Z$, where $Z=-X$ is also $N(0,1)$ (why?). By the analysis above, $Y\sim N(a,(-b)^2)=N(a,b^2)$.
Finally, if $b=0$, then $Y$ is nonrandom and always equals $a$, which can be thought of as $N(a,0)=N(a,b^2)$.
So in all cases, you have $Y\sim N(a,b^2)$.
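A quick simulation confirms the conclusion: drawing standard normal samples $X$ and forming $Y=a+bX$ (with illustrative values $a=2$, $b=3$), the sample mean and variance of $Y$ should come out close to $a$ and $b^2$.

```python
import random
import statistics

random.seed(0)
a, b = 2.0, 3.0  # hypothetical parameter values

# Y = a + b*X with X ~ N(0, 1); expect mean a and variance b^2
samples = [a + b * random.gauss(0.0, 1.0) for _ in range(200_000)]

mean = statistics.fmean(samples)
var = statistics.pvariance(samples)
print(mean, var)  # roughly a = 2 and b^2 = 9
```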