Asymptotic rate of decrease of error function


The complementary error function is defined as $$ \text{erfc}(x) = 1 - \frac{2}{\sqrt{\pi}}\int_0^{x} e^{-t^2}\, dt $$ and is related to the Gaussian (Normal) distribution. Is there an approximation of the form $\exp(g(x))$ that converges to $\text{erfc}(x)$ asymptotically? I.e., can we find $g(x)$ such that
$$ \underset{x \rightarrow \infty}{\lim} \frac{\exp(g(x))}{\text{erfc}(x)} = 1 $$ and that $\exp(g(x))$ "approximates" $\text{erfc}(x)$ in some sense when $x$ is large but not infinite?

On BEST ANSWER

There is an asymptotic expansion in the Handbook of Mathematical Functions (also listed in your link); it is not quite in the form you wanted, but close.

If $$\text{erfc}(z) = \frac{2}{\sqrt{\pi}} \int_z^\infty e^{-t^2}~ dt = 1 - \text{erf}(z)$$ then $$\text{erfc}(z)\sim \frac{1}{\sqrt{\pi}}e^{-z^2} \cdot \sum_{k=0}^\infty (-1)^k~\frac{(2k)!}{2^{2k}k! } \cdot \frac{1}{z^{2k+1}} $$ as $z\to\infty$, $\lvert \arg z \rvert < 3\pi/4. $
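As a numerical sanity check, the truncated series can be compared against a library implementation of $\text{erfc}$. The sketch below (function names are my own; Python's standard-library `math.erfc` serves as the reference) sums the first few terms of the expansion above:

```python
import math

def erfc_asymptotic(z, n_terms=4):
    """Truncated asymptotic series for erfc(z), valid for large z:

    erfc(z) ~ (e^{-z^2}/sqrt(pi)) * sum_k (-1)^k (2k)!/(2^{2k} k!) * z^{-(2k+1)}
    """
    s = 0.0
    for k in range(n_terms):
        s += (-1) ** k * math.factorial(2 * k) \
             / (2 ** (2 * k) * math.factorial(k)) / z ** (2 * k + 1)
    return math.exp(-z * z) / math.sqrt(math.pi) * s

# The ratio to math.erfc approaches 1 as z grows:
for z in (2.0, 3.0, 5.0):
    print(z, erfc_asymptotic(z) / math.erfc(z))
```

Note the series is asymptotic, not convergent: for fixed $z$, adding more terms eventually makes the approximation worse, so one truncates after a few terms.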

This can be adapted to the required form; for example, keeping only the first term gives $$\text{erfc}(z) \approx \frac{1}{\sqrt \pi} e^{-z^2}\cdot \frac{1}{z} = e^{-z^2-\log(z\sqrt{\pi})},$$ which gives $g(z) = -z^2-\log(z\sqrt{\pi})$.
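To illustrate that this $g$ satisfies the limit in the question, here is a minimal sketch (using Python's standard-library `math.erfc` as the reference) showing the ratio $\exp(g(z))/\text{erfc}(z)$ approaching 1:

```python
import math

def g(z):
    # First-term approximation: g(z) = -z^2 - log(z * sqrt(pi))
    return -z * z - math.log(z * math.sqrt(math.pi))

# exp(g(z)) / erfc(z) -> 1 as z -> infinity;
# the relative error shrinks like 1/(2 z^2), per the next series term.
for z in (2.0, 5.0, 10.0):
    print(z, math.exp(g(z)) / math.erfc(z))
```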