I have this statement:
A continuous random variable $X \sim N(\mu, \sigma^2)$ where $\mu = 1.5$, $\sigma^2 = 0.1$.
If $\mu$ increases by $20\%$, what is the new standard deviation?
My attempt was:
When $\mu$ is multiplied by some constant $k$, in this case $k = 1.2$, this means:
$$\mu\cdot k = k\cdot P(X=x_1)\cdot x_1+\dots+k\cdot P(X=x_n)\cdot x_n$$
$$\mu \cdot k = k\left[ P(X=x_1)\cdot x_1+\dots+ P(X=x_n) \cdot x_n\right]$$
So we are multiplying all the data by a factor $k$, and using the property:
If you multiply all the data by a constant $c$, the new standard deviation is $\sigma \cdot c$, and the new variance is $\sigma^2 \cdot c^2$.
Thus, the new standard deviation is equal to $\sqrt{0.1}\cdot 1.2$.
But the correct answer is supposed to be $1.2^2 \cdot 0.1$. So, what is wrong with my reasoning?
You wrote that $\mu$ increased by $20\%$, i.e., the new mean is $1.5\times 1.2=1.8$. Let us consider another random variable $Y=1.2X$. Note that $\mathbb{E}(Y)=1.8$ and $\operatorname{Var}(Y)=\operatorname{Var}(1.2X)=1.2^2\operatorname{Var}(X)=1.2^2\times 0.1$. Note that $1.2^2\times 0.1$ is the new *variance*; its square root, $1.2\sqrt{0.1}$, is exactly the standard deviation you computed, so the two answers agree.
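As a quick numerical sanity check (not part of the original post), a Monte Carlo simulation with NumPy confirms both scalings: the mean of $Y = 1.2X$ is $1.2\mu$, its variance is $1.2^2\sigma^2$, and its standard deviation is $1.2\sigma$.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Draw samples of X ~ N(1.5, 0.1); note scale is the standard deviation.
x = rng.normal(loc=1.5, scale=np.sqrt(0.1), size=1_000_000)

# Scale every sample by 1.2, as in Y = 1.2 X.
y = 1.2 * x

print(y.mean())  # close to 1.2 * 1.5   = 1.8
print(y.var())   # close to 1.2**2 * 0.1 = 0.144  (the new variance)
print(y.std())   # close to 1.2 * sqrt(0.1) ≈ 0.3795 (the new standard deviation)
```

The printed variance matches $1.2^2 \times 0.1$ while the printed standard deviation matches $1.2\sqrt{0.1}$, illustrating that both quoted answers describe the same distribution.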