Explicit relation between the variance (or standard deviation) and $x$ (the values taken by the random variable)


So I am trying to find the relationship between the variance (or the standard deviation) and the values taken by a random variable.

I am trying to write down a definition, so for example: for any value $x$ drawn from a normal distribution $N(\mu,\sigma)$, we have $x = $ something in terms of $\sigma$ and $\mu$.
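To make the form I am after concrete (this is only my guess at what such a relation might look like, based on the standardization of the normal distribution): $$x = \mu + z\,\sigma, \qquad z \sim N(0,1),$$ i.e. every observed value would be expressed as the mean plus some multiple of the standard deviation.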

I think $\sigma$ plays a bigger role here than $\mu$, since the spread of the values around the mean is governed by the variance and not by the value of the mean.
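For instance (if I understand the standard results correctly), Chebyshev's inequality bounds how far the values stray from the mean purely in terms of $\sigma$: $$P\left(|X-\mu| \ge k\sigma\right) \le \frac{1}{k^2},$$ which is the sort of statement that suggests $\sigma$, and not $\mu$, controls the range of values taken around the mean.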

Is this notion something standard that I am missing? If not, can someone please help me formulate such a definition?

Thank you in advance.