How would you approximate the magnitude of the probability of an event with an extremely large number of standard deviations? All numerical approximations that I have found have far too large relative errors or do not handle large σ well.
My first rough estimate would be that the exponent x in 10^-x increases by half the value of σ for each additional σ.
For example, for a 670,000σ event, the exponent would be approximately half the sum of 1 to 670,000 = 224,450,335,000/2 ≈ 100,000,000,000, i.e. a probability on the order of 10^-100,000,000,000.
I'd like to know if this would be roughly correct, and if there is a better approximation.
It depends on the distribution, of course. If $X$ has a normal distribution with mean $\mu$ and standard deviation $\sigma$, then as $n \to \infty$ $$ \mathbb P(X \ge \mu + n \sigma) = \frac{\exp(-n^2/2)}{\sqrt{2 \pi}} \left(\frac{1}{n} - \frac{1}{n^3} + \frac{3}{n^5} + O(1/n^7)\right)$$ Asymptotic results for other distributions are also available.
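To evaluate this for very large $n$ without underflow, work entirely in log-space. Here is a quick sketch in Python (the function name is mine) that computes $\log_{10}$ of the tail probability from the leading terms of the expansion above:

```python
import math

def log10_normal_tail(n):
    """log10 of P(X >= mu + n*sigma) for a normal distribution,
    via the asymptotic expansion exp(-n^2/2)/(n*sqrt(2*pi)) * (1 - 1/n^2 + 3/n^4 - ...).
    Computed in log-space so it works for astronomically large n."""
    # natural log of the leading term exp(-n^2/2) / (n * sqrt(2*pi))
    log_p = -n * n / 2 - math.log(n) - 0.5 * math.log(2 * math.pi)
    # correction factor from the series (utterly negligible for huge n)
    log_p += math.log1p(-1 / n**2 + 3 / n**4)
    return log_p / math.log(10)

print(log10_normal_tail(670_000))
```

For $n = 670{,}000$ this gives roughly $-9.75 \times 10^{10}$, dominated by the $-n^2/(2 \ln 10)$ term, so your estimate of about $10^{-10^{11}}$ has the right order of magnitude for the exponent, though it overshoots it by about 15%.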