I am imagining a normal distribution with a mean of zero and a standard deviation of S.
I know that the following function gives the probability that a random deviate (under the assumption of normality) falls within ±z standard deviations of the mean:
erf(z/Sqrt[2])
I have what is likely a dumb question: how can I calculate the probability that the random deviate falls within z standard deviations to the right of the mean? That is, how do I calculate the probability that the deviate falls between the mean and +z standard deviations? Do I simply multiply the above function by 0.5?
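For context, here is a quick numeric check I put together of the two-sided formula above (a minimal sketch in Python; the function name `two_sided` is my own, and I'm using the standard library's `math.erf` and `statistics.NormalDist` for the cross-check):

```python
import math
from statistics import NormalDist

def two_sided(z):
    """P(-z < X < z) for a standard normal X, via erf(z / sqrt(2))."""
    return math.erf(z / math.sqrt(2))

# Cross-check against the standard normal CDF:
# P(-z < X < z) should equal Phi(z) - Phi(-z).
std = NormalDist()  # mean 0, standard deviation 1
for z in (0.5, 1.0, 2.0):
    via_cdf = std.cdf(z) - std.cdf(-z)
    assert abs(two_sided(z) - via_cdf) < 1e-12
```

For example, `two_sided(1.0)` returns about 0.6827, the familiar 68% of a normal distribution within one standard deviation of the mean.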
Any help would be greatly appreciated! I am very new to statistics.