Calculating the mean of the random variable $g(x)=\frac{1}{(1.1-x)^2}$, where $x$ has the uniform distribution $U(0,1)$:
$E[g(x)]=\int\limits_{0}^{1}\frac{1}{(1.1-x)^2}dx=\frac{1}{0.1}-\frac{1}{1.1}\approx 9.09$. To find the variance, first compute the second moment: $E[(g(x))^2]=\int\limits_{0}^{1}\frac{1}{(1.1-x)^4}dx=\left.\frac{1}{3(1.1-x)^3}\right|_0^1\approx 333.08$.
$Var = E[(g(x))^2]-(E[g(x)])^2 = 333.08-9.09^2 \approx 250.45$,
and the standard deviation is the square root of the variance, i.e., $\sqrt{250.45}\approx 15.82$.
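As a sanity check on the numbers above, here is a small Monte Carlo sketch (the sample size and seed are my own arbitrary choices):

```python
import random
import statistics

# Monte Carlo estimate of E[g(x)] and SD[g(x)] for x ~ U(0,1), g(x) = 1/(1.1 - x)^2
# (sample size and seed are arbitrary choices)
random.seed(42)
samples = [1 / (1.1 - random.random()) ** 2 for _ in range(1_000_000)]

mean = statistics.fmean(samples)
sd = statistics.pstdev(samples)
print(f"mean = {mean:.2f}, sd = {sd:.2f}")  # should land near 9.09 and 15.82
```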
My question is: if the range of $g(x)$ for $x\in(0,1)$ runs from $1/1.1^2\approx 0.83$ to $100$ and the mean is $9.09$, how can the standard deviation be $15.82$?
That would mean that if $68\%$ of the results were within $9.09 \pm 15.82$, some of the results would lie outside the possible range. If the distribution is somehow skewed, how should I read the standard deviation?
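To make the puzzle concrete, here is a quick simulation of how much of the mass actually falls within one standard deviation of the mean (again, sample size and seed are my own choices; I am aware the $68\%$ figure comes from the normal distribution, so it need not apply here):

```python
import random

# Fraction of simulated g(x) values within one standard deviation of the mean,
# using the analytic values mean = 9.09, sd = 15.82 derived above
# (sample size and seed are arbitrary choices)
random.seed(42)
mean, sd = 9.09, 15.82
n = 1_000_000
inside = sum(1 for _ in range(n)
             if abs(1 / (1.1 - random.random()) ** 2 - mean) <= sd)
print(f"coverage within one sd: {inside / n:.3f}")
```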