Standard deviation when a data set is multiplied by a constant.


Hi guys, I recently started learning statistics and I'm a bit confused. I know that when a data set is multiplied by a positive constant, its standard deviation and mean are both multiplied by that same constant.

However, when it's multiplied by a negative constant $x$, the mean is multiplied by $x$ but the standard deviation is multiplied by $-x$ (that is, by $|x|$). Why is the standard deviation not multiplied by a negative number?
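Here is a quick numerical check of the behavior I'm describing, using NumPy with a made-up data set and the constant $x = -2$ (the same pattern holds for any data set and any negative constant):

```python
import numpy as np

data = np.array([1.0, 2.0, 3.0, 4.0])
scaled = -2 * data  # multiply the data set by a negative constant x = -2

# The mean is multiplied by x = -2, so its sign flips.
print(data.mean(), scaled.mean())

# The standard deviation is multiplied by |x| = 2 and stays positive.
print(data.std(), scaled.std())
```

Running this shows the scaled mean is the original mean times $-2$, while the scaled standard deviation is the original one times $2$, not $-2$.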