When converting units of standard deviation, should I do the calculations in variance then convert the variance back?
For reference, I am trying to convert a standard deviation from Fahrenheit to Celsius. I have a standard deviation of 0.9, so I am computing $\frac{5}{9} \cdot 0.9^2 = 0.45$ and then taking $\sqrt{0.45}$ to get my new standard deviation, since the conversion equation is $\text{celsius}=\frac{5}{9}(\text{fahrenheit}-32)$. But I think I am overcomplicating it and should just convert the standard deviation to Celsius directly, without going through the variance.
The units of variance are the square of the units used for the distribution; you can verify that from the defining equation for variance. If you want to convert a variance measured in $(^\circ F)^2$ to one measured in $(^\circ C)^2$, you need to multiply by $\left(\frac 59\right)^2$ (the $-32$ shifts the mean but has no effect on the spread). The units of standard deviation are the same as the units of the distribution, so if you are converting a standard deviation you just multiply by $\frac 59$: here $0.9 \times \frac 59 = 0.5\,^\circ C$. Your calculation multiplied the variance by $\frac 59$ rather than $\left(\frac 59\right)^2$, which is why the two routes disagreed; done consistently, both give $0.5\,^\circ C$.
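As a quick sanity check, here is a small numerical sketch (the sample of Fahrenheit readings and its parameters are made up for illustration, and it assumes NumPy is available): converting every observation to Celsius and then computing the standard deviation gives the same result as scaling the Fahrenheit standard deviation by $\frac 59$, and the $-32$ offset drops out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample of temperatures in degrees Fahrenheit
temps_f = rng.normal(loc=68.0, scale=0.9, size=100_000)

# Route 1: convert every observation to Celsius, then compute the SD
temps_c = (temps_f - 32.0) * 5.0 / 9.0
sd_direct = temps_c.std(ddof=1)

# Route 2: scale the Fahrenheit SD by 5/9 (the -32 shift cancels out)
sd_scaled = temps_f.std(ddof=1) * 5.0 / 9.0

print(sd_direct, sd_scaled)  # both are approximately 0.9 * 5/9 = 0.5
```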