I have not done stats for several years and seem to have forgotten the basics.
I am trying to find the standard deviation of a normal distribution given a desired mean and cumulative probability between $a \in \mathbb R$ and $b \in \mathbb R$, with $a < b$.
For example, if I want $\mu = 5, a = 0, b = 10, \operatorname{cumulative}(a,b) = 98.76\%$, I would get $\sigma = 2$ ($\operatorname{cumulative}(a,b)$ is the cumulative probability from $a$ to $b$).
Right now, I am using http://onlinestatbook.com/2/calculators/normal_dist.html and finding the desired standard deviation by trial and error (changing the standard deviation and checking the resulting probability).
What formula can I use to get $\sigma$ of a normal distribution such that the probability between $a \in \mathbb R$ and $b \in \mathbb R$, with $a < b$, is $x \in [0,1]$?

If you know what $\operatorname{cumulative}(-\infty,a)$ or $\operatorname{cumulative}(-\infty,b)$ is, you can find $\sigma$ by standardizing your distribution. Consider the standard normal distribution, $Z\sim N(0,1)$ (i.e. a normal distribution with $\mu=0$, $\sigma=1$).
Standardizing sets $z=\frac{x-\mu}{\sigma}$. In other words, $\Phi^{-1}(\operatorname{cumulative}(-\infty,a))=\frac{a-\mu}{\sigma}$, where $\Phi^{-1}$ is the inverse standard normal CDF (often called $\operatorname{invNorm}$ on calculators). Repeating this with $b$ gives a second equation; since $\mu$ is already known, either equation alone can be solved for $\sigma$. In your example the interval is symmetric about $\mu$, so $\operatorname{cumulative}(-\infty,a)=\frac{1-x}{2}$: with $x = 0.9876$ this gives $\Phi^{-1}(0.0062)\approx -2.5$, hence $\sigma=\frac{0-5}{-2.5}=2$. This technique is known as standardizing the normal distribution.
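As a sketch of this computation in Python (using the standard library's `NormalDist` for the inverse CDF; `sigma_for_interval` is a hypothetical helper name, and it assumes $a$ and $b$ are symmetric about $\mu$ as in your example):

```python
from statistics import NormalDist

def sigma_for_interval(mu, a, b, x):
    """Return sigma such that P(a < X < b) = x for X ~ N(mu, sigma^2),
    assuming the interval [a, b] is symmetric about mu."""
    p_low = (1 - x) / 2              # cumulative(-inf, a) by symmetry
    z = NormalDist().inv_cdf(p_low)  # standard normal quantile (negative)
    return (a - mu) / z              # solve z = (a - mu) / sigma for sigma

print(sigma_for_interval(5, 0, 10, 0.9876))  # close to 2
```

If the interval is not symmetric about $\mu$, you would need $\operatorname{cumulative}(-\infty,a)$ itself rather than deriving it from $x$.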