Hello, so I think I did this correctly, but I just want to make sure. For the expected value, I used linearity of expectation: $E[Y] = aE[X] + b = a\mu + b$.
Setting this equal to $0$ gives $a\mu + b = 0$, so $b = -a\mu$.
Now for the variance, I used the well-known formula $\operatorname{Var}(aX+b) = a^2\operatorname{Var}(X)$. So that is $1 = a^2\sigma^2$, which means $a$ would have to be $1/\sigma$ (and hence $b = -\mu/\sigma$).
I know this may be simple, but I am brand new to stats, so I need to get my fundamentals down.

An easy way to solve the problem is to standardize $X$. Thus the r.v.
$$ \bbox[5px,border:2px solid black] { Y=\frac{X-\mu_X}{\sigma_X} \qquad (1) } $$
is distributed with mean zero and variance 1.
It is easy to rewrite (1) in the following way:
$$ \bbox[5px,border:2px solid black] { Y=\frac{1}{\sigma_X}\cdot X-\frac{\mu_X}{\sigma_X}=aX+b } $$
thus
$$a=\frac{1}{\sigma_X}$$
and
$$b=-\frac{\mu_X}{\sigma_X}$$
In my opinion this is the most natural way to solve your problem with a statistical way of thinking.
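If you want to convince yourself numerically, here is a quick simulation (my own sketch, not part of the derivation): it draws samples of $X$ with an arbitrary mean and standard deviation, applies $Y = aX + b$ with $a = 1/\sigma_X$ and $b = -\mu_X/\sigma_X$, and checks that the sample mean and variance of $Y$ come out close to $0$ and $1$.

```python
import numpy as np

# Simulate X with arbitrary (illustrative) parameters mu and sigma,
# then standardize via Y = aX + b with a = 1/sigma, b = -mu/sigma.
rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0                    # arbitrary choices for illustration
x = rng.normal(mu, sigma, size=1_000_000)

a = 1 / sigma                            # a = 1/sigma_X
b = -mu / sigma                          # b = -mu_X/sigma_X
y = a * x + b                            # identical to (x - mu) / sigma

# The sample mean should be near 0 and the sample variance near 1.
print(y.mean(), y.var())
```

Note that the distribution of $X$ does not matter here: standardization fixes the mean and variance regardless of the underlying law, so you could replace the normal samples with, say, exponential ones and get the same check.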