An energy-saving lamp burns for an average of 10,000 hours before it fails, with a standard deviation of 800 hours.
What is the minimum burn time achieved by 90% of the lamps? Assume the burn time is normally (Gaussian) distributed.
I used this formula:
$z = \frac{x - \mu}{\sigma }$
Transformed it to:
$x = \mu + z \cdot \sigma$
Which is:
$x = 10000 + 1.2815 \cdot 800 = 11025.2$
That is not the right result. What am I doing wrong?
Addendum
After some trial and error I came to the conclusion that the following equation, once transformed, leads to the right result:
$z = \frac{\mu - x}{\sigma}$
Transformed:
$x = \mu - z \cdot \sigma$
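To sanity-check this numerically, here is a quick sketch using Python's standard-library `statistics.NormalDist` (the lamp parameters come from the problem; the library use is my own addition, not part of the original post):

```python
from statistics import NormalDist

# Lamp lifetime: mean 10,000 hours, standard deviation 800 hours.
lamp = NormalDist(mu=10_000, sigma=800)

# x = mu - z * sigma with z = 1.2815 (one-sided 90% quantile of the standard normal).
x = 10_000 - 1.2815 * 800
print(round(x, 1))  # 8974.8

# Fraction of lamps burning at least x hours; should be about 0.90.
print(round(1 - lamp.cdf(x), 3))  # ≈ 0.9
```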
Your z-score had the wrong sign. The question asks for the minimum burn time achieved by 90% of the lamps, so you are looking for the 10th percentile, not the 90th. That means going 1.2815 standard deviations below the mean:

$x = 10000 - 1.2815 \cdot 800 \approx 8975$ hours.
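You can also read the 10th percentile off directly with the inverse CDF; a minimal sketch using Python's stdlib `statistics.NormalDist` (my own illustration, not part of the original answer):

```python
from statistics import NormalDist

lamp = NormalDist(mu=10_000, sigma=800)

# Minimum burn time achieved by 90% of the lamps = the 10th percentile.
x_min = lamp.inv_cdf(0.10)
print(round(x_min, 1))  # ≈ 8974.8 hours
```

`inv_cdf(0.10)` returns the x for which P(X ≤ x) = 0.10, which is exactly the threshold that 90% of the lamps exceed.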