I have a mean and a desired range for a normal distribution, how do I discover what standard deviation I need? I may not be using the correct terminology so here's a graph:

Based on this, if you have a mean of 0 and a range of about -1.5 to 1.5, the graph shows you need a standard deviation equal to the square root of 0.2 (the blue line).
Here are some examples of the problems I'm trying to solve:
If my mean is 100 and I want the range to be from 90 to 110, how do I figure out what standard deviation I need?
If my mean is 50 and I want the range to be from 0 to 100, how do I figure out what standard deviation I need?
To be completely precise, the normal distribution has an infinite range: no matter how high you go, there's still some probability of going higher.
In practice, though, your question makes perfect sense. A truly normal variable strays beyond, say, 10 standard deviations less than once every supercalifragilisticatillion draws.
Normally (pardon the pun) you'll define something like a "confidence interval": if you say "X is between a and b", you'll only be wrong 5% of the time, or 1%, and so on. The error probability $\alpha$ you choose defines a "critical value" $z_{\alpha/2}$. So if you want to be wrong only 5% of the time, your critical value is 1.96, because $z_{0.025}=1.96$.
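You can check that critical value yourself, a quick sketch using Python's standard-library `statistics.NormalDist` (its `inv_cdf` is the inverse of the standard normal CDF):

```python
from statistics import NormalDist

# For a 95% interval, alpha = 0.05, and the critical value is the
# point with 1 - alpha/2 = 0.975 of the probability mass below it.
alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)
print(round(z, 2))  # 1.96
```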
Then, you can work out your interval from your mean and standard deviation:
$\mu\pm z_{\alpha/2}\sigma$
If you already have an interval, say $(a,b)$, you just let $a=\mu-z_{\alpha/2}\sigma$ and $b=\mu+z_{\alpha/2}\sigma$ and solve for $\mu$ and $\sigma$. This will give
$\mu=\frac{a+b}2$
and
$\sigma=\frac{b-a}{2z_{\alpha/2}}$
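Putting the two formulas together, here is a small sketch that answers the examples in the question (the helper name `sigma_for_interval` is mine; it uses only the standard-library `statistics` module):

```python
from statistics import NormalDist

def sigma_for_interval(a, b, alpha=0.05):
    """Return (mu, sigma) so that a normal N(mu, sigma) puts
    probability 1 - alpha inside the interval (a, b)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)  # critical value z_{alpha/2}
    mu = (a + b) / 2                         # mu = (a + b) / 2
    sigma = (b - a) / (2 * z)                # sigma = (b - a) / (2 z)
    return mu, sigma

print(sigma_for_interval(90, 110))  # mu = 100.0, sigma ≈ 5.10
print(sigma_for_interval(0, 100))   # mu = 50.0,  sigma ≈ 25.51
```

So with a 95% interval, "mean 100, range 90 to 110" needs a standard deviation of about 5.1, and "mean 50, range 0 to 100" needs about 25.5. Tighten `alpha` (say, to 0.01) and the same range demands a smaller standard deviation.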