In June, New York City's daily maximum temperature has a mean of 80 °F and a standard deviation of 5 °F. What is the absolute difference between the mean and the variance in °C?
I can't see what mistake I'm making in the following approach:
$$°F = \frac{9}{5}\,°C + 32$$
$$\Rightarrow \operatorname{SD}(°F) = \frac{9}{5}\operatorname{SD}(°C) \Rightarrow \operatorname{Var}(°F) = \frac{81}{25}\operatorname{Var}(°C)$$
$$\Rightarrow \operatorname{Mean}(°F) = \frac{9}{5}\operatorname{Mean}(°C) + 32$$
Substituting $\operatorname{Var}(°F) = 25$ and $\operatorname{Mean}(°F) = 80$ into the equations above gives $\operatorname{Var}(°C) = \frac{625}{81}$ and $\operatorname{Mean}(°C) = \frac{2160}{81} = \frac{80}{3}$.
This gives the required absolute difference as $\frac{2160}{81} - \frac{625}{81} = \frac{1535}{81} \approx 18.95$, which is marked incorrect.
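For what it's worth, here is a quick numeric check of the arithmetic above (plain Python; the variable names are mine). It reproduces the same values, which suggests that if the answer is wrong, the problem lies in the setup or the intended interpretation of the question, not in the computation:

```python
# Sanity check: convert mean/variance of a temperature from °F to °C.
mean_F, sd_F = 80.0, 5.0

# C = (F - 32) * 5/9 is a linear transform, so:
mean_C = (mean_F - 32) * 5 / 9       # mean is scaled and shifted
var_C = sd_F**2 * (5 / 9) ** 2       # variance scales by (5/9)^2; the +32 shift drops out

print(mean_C)               # ≈ 26.6667  (= 2160/81 = 80/3)
print(var_C)                # ≈ 7.7160   (= 625/81)
print(abs(mean_C - var_C))  # ≈ 18.9506  (= 1535/81)
```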