Asymptotic behaviour of $E_1(-x) = \Gamma(0,-x)$.


Updated question: I think my original question may have been a bit odd, mainly because taking the absolute value of the exponential integral makes little sense, but I have finally made some progress myself by using the defining recurrence $$ \Gamma\left(a+1,z\right)=a\Gamma\left(a,z\right)+z^{a}e^{-z}.$$ This provides valuable information on the behaviour of $E_x(-x)$, equivalently of $\Gamma(1-x,-x)$. However, to determine this behaviour, it is necessary to say something about $$\Gamma(0,-x) = E_1(-x).$$

It seems that $$\Im(E_1(-x)) = -\pi,$$ i.e. the imaginary term is $-\pi i$. This is also consistent with Equation 6.6.2 in the DLMF, since it comes from the $\operatorname{Log}(-x)$, which here seems to be evaluated as $\operatorname{Log}(x) + \pi i$. However, analysing the real part of $E_1(-x)$ seems harder, since the estimates in the DLMF are generally for $E_1(x)$ as $x \to \infty$ instead. For this reason I am changing my question to: can someone describe the behaviour of $E_1(-x)$ as $x \to \infty$?

From what I have found so far, it seems that $$\Re(\Gamma(0,-x)) \approx -\frac{e^x}{x} - \frac{e^x}{x^2} - 2\frac{e^x}{x^3} - 6\frac{e^x}{x^4} - \cdots - (m-1)!\,\frac{e^x}{x^m}.$$ I don't have much basis for this claim beyond testing, however. The way I found it: assuming $\Gamma(0,-x)$ is some sort of series and applying the recurrence above, the highest-order term cancels each time if and only if the series looks like this. My mathematical understanding is not good enough to show why this should be the case. Maybe it could be derived by rewriting Equation 6.6.2 in the DLMF, which is of a somewhat similar form? Any insight is appreciated.
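The two conjectures above (imaginary part $\pm\pi$, real part matching the factorial series) can be checked numerically. A minimal sketch using mpmath's `e1`; the truncation point $m=7$ is an arbitrary choice for illustration:

```python
# Numerical check of the conjecture on E_1(-x) = Gamma(0, -x), using mpmath.
from mpmath import mp, mpf, e1, exp, factorial, pi

mp.dps = 30
x = mpf(30)
val = e1(-x)  # E_1 continued to the negative real axis (a branch cut)

# |Im E_1(-x)| should equal pi; the sign of the imaginary part depends on
# which side of the cut the implementation continues from.
im_err = abs(abs(val.imag) - pi)

# Real part: compare with the conjectured truncated series
#   -sum_{k=1}^{m} (k-1)! e^x / x^k,   here with m = 7.
partial = -sum(factorial(k - 1) * exp(x) / x**k for k in range(1, 8))
rel_err = abs((val.real - partial) / val.real)
print(im_err, rel_err)
```

At $x=30$ the first omitted term is $7!\,e^x/x^8$, so the relative error of the truncation should be of order $7!/x^7 \approx 2\cdot10^{-7}$.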


Edit: As elemelons pointed out, this is indeed the case. Combining it with the recurrence gives me a good idea of why the incomplete gamma function with both arguments negative behaves the way it does. This doesn't fully answer my original question, where I was dealing with a factor $a$, but it provides enough for me to continue on my own.


Original title clarification: $f = \Omega(g)$ means $g = O(f)$ in Bachmann–Landau notation. The norm $\Vert\cdot\Vert$ is the Euclidean norm on complex numbers, $a$ is a real, positive constant, and I want to show this for an arbitrary positive value of $m$. In other words: I hope to prove that the function $\Vert E_x(-\frac{x}{a}) \Vert$ of $x$ grows at least as quickly as $\frac{1}{x^m}e^{\frac{x}{a}}$ for some $m$.


Original question: I have been working on a problem which involves the exponential integral function $E_p(z)$, which can be defined as $$ E_{p}\left(z\right):=z^{p-1}\Gamma\left(1-p,z\right), $$ which equals $$E_p(z) = z^{p-1} \int_{z}^{\infty} \frac{e^{-t}}{t^p} \,\text{d}t$$ when this integral is defined (in particular, I think, when the real part of $z$ is positive). This information comes from the Digital Library of Mathematical Functions, Chapter 8.19.
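For what it's worth, the two forms of the definition agree numerically where the integral converges. A small sketch using mpmath, where `expint(p, z)` is the generalized exponential integral and `gammainc(s, a)` (upper limit defaulting to $\infty$) is the upper incomplete gamma $\Gamma(s,a)$; the values of $p$ and $z$ are arbitrary test points:

```python
# Sanity check: E_p(z) = z^(p-1) * Gamma(1-p, z) for Re z > 0.
from mpmath import mp, mpf, expint, gammainc

mp.dps = 25
p, z = mpf('2.5'), mpf(2)
lhs = expint(p, z)                       # E_p(z)
rhs = z**(p - 1) * gammainc(1 - p, z)    # z^(p-1) * Gamma(1-p, z)
print(abs(lhs - rhs))
```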

I have a function which contains the term $$\Vert E_x(-\tfrac{x}{a}) \Vert e^{-\frac{x}{a}},$$ where $a$ is a real, positive constant and $\Vert\cdot\Vert$ is the Euclidean norm on complex numbers. I have been trying to determine the behaviour of this function as $x$ grows large, but I am having a lot of trouble, so the only real ideas I have come from testing. It seems that no matter the value of $a$, I get $$\Vert E_x(-\tfrac{x}{a}) \Vert e^{-\frac{x}{a}} = \Omega(\tfrac{1}{x}).$$ In particular it seems that $$\Vert E_x(-\tfrac{x}{a}) \Vert e^{-\frac{x}{a}} = O(\tfrac{1}{x})$$ holds as well if and only if $0 < a < 1$. Here I am using $f = \Omega(g)$ to mean $g = O(f)$, so I am expecting $$\frac{1}{x} \leq C \Vert E_x(-\tfrac{x}{a}) \Vert e^{-\frac{x}{a}}$$ for all $x \geq x_0$; it seems $C = 1$ works in particular. However, I would actually be happy with much looser estimates, such as showing that this product is $\Omega(\frac{1}{x^m})$, which is equivalent to my title.
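This kind of testing can be reproduced with mpmath, whose `expint` continues $E_p$ to negative arguments. A rough sketch, only probing the scaled quantity $x\,\Vert E_x(-x/a)\Vert e^{-x/a}$ at a few points; the sample values of $x$ and $a$ are arbitrary, and this is numerical evidence, not a proof:

```python
# Probe x * ||E_x(-x/a)|| * exp(-x/a) for a couple of values of a.
# If the Omega(1/x) observation holds, this stays bounded away from 0;
# if additionally O(1/x) holds (claimed for 0 < a < 1), it stays bounded above.
from mpmath import mp, mpf, expint

mp.dps = 30

def scaled(x, a):
    x, a = mpf(x), mpf(a)
    return x * abs(expint(x, -x / a)) * mp.exp(-x / a)

for x in ('20.5', '40.5', '60.5'):  # non-integer x, as in the question
    print(scaled(x, '0.5'), scaled(x, 2))
```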

In the Digital Library of Mathematical Functions I found a lot of results that seem promising, but with which I didn't achieve much. In particular, I thought that the one in Chapter 8.20 looked good, namely that as $p \to \infty$ we have $$ E_{p}\left(\lambda p\right)\sim\frac{e^{-\lambda p}}{(\lambda+1)p}\sum_{k=0}^{\infty}\frac{A_{k}(\lambda)}{(\lambda+1)^{2k}}\frac{1}{p^{k}}.$$ This seemed useful mainly because the factor $\frac{e^{x}}{x}$ I expected asymptotically already appears in it. However, I later realised that this result requires $\lambda > 0$, while in my case $\lambda = -\frac{1}{a}$. The same happened when I tried to use the incomplete-$\Gamma$ definition: since $x$ is positive and $-\frac{x}{a}$ is negative, the integral representation cannot be used, and it is hard to figure out what can be used instead.

I am hoping that somebody has ideas, since I am completely clueless. Clearly $E_p(z)$ has some kind of behaviour even for negative $z$, and in particular $E_x(-\frac{x}{a})$ does. It should also have some asymptotic behaviour, and my testing gives very consistent results; yet I don't know how to analyse this behaviour properly. Any help in establishing the behaviour I expect, or any other estimates, is much appreciated.


Accepted answer:

The integral

$$ \mathrm{E}_1(-x) \,=\, \int_{-x}^\infty\frac{e^{-t}}{t}\,\mathrm{d}t \,=\, -\int_{-\infty}^x \frac{e^t}{t}\,\mathrm{d}t \,=\, -\mathrm{Ei}(x) $$

is defined around $t=0$ using the Cauchy principal value. This means we can write

$$ \mathrm{Ei}(x)=\lim_{h\to0^+}\Biggl[\int_{-\infty}^{-h}\frac{e^t}{t}\mathrm{d}t+\int_h^x\frac{e^t}{t}\mathrm{d}t\Biggr]. $$
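This limit can be illustrated numerically: with a small but fixed gap $(-h,h)$ removed, the two truncated integrals already approximate $\mathrm{Ei}(x)$, with an error of order $h$ (the excluded principal-value piece is $\int_0^h \frac{e^t-e^{-t}}{t}\,\mathrm{d}t \approx 2h$). A sketch using mpmath's `quad` and built-in `ei`; the values of $x$ and $h$ are arbitrary:

```python
# Approximate Ei(x) via the principal-value definition with gap (-h, h).
from mpmath import mp, mpf, quad, exp, ei, inf

mp.dps = 20
x, h = mpf(2), mpf('1e-6')
f = lambda t: exp(t) / t
# integrate over (-inf, -h] and [h, x], splitting the second at t = 1
approx = quad(f, [-inf, -h]) + quad(f, [h, 1, x])
print(abs(approx - ei(x)))  # error of order h
```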

Alternatively, we can split the domain of integration into $(-\infty,-x)$ and $[-x,x]$; on the latter interval the principal value can be rewritten by folding the two half-ranges into a single integral. In other words,

$$ \mathrm{Ei}(x)=\int_{-\infty}^{-x}\frac{e^t}{t}\,\mathrm{d}t+\int_0^x\frac{e^{t}-e^{-t}}{t}\,\mathrm{d}t=\mathrm{Ei}(-x)+2\,\mathrm{Shi}(x), $$

where $\mathrm{Shi}(x)$ is the hyperbolic sine integral.
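This decomposition, equivalently $\mathrm{Shi}(x)=\frac12\left[\mathrm{Ei}(x)-\mathrm{Ei}(-x)\right]$, is easy to confirm numerically, since mpmath implements both `ei` and `shi`:

```python
# Check Ei(x) = Ei(-x) + 2*Shi(x) at a few sample points.
from mpmath import mp, mpf, ei, shi

mp.dps = 30
for x in (mpf('0.5'), mpf(3), mpf(10)):
    lhs = ei(x)
    rhs = ei(-x) + 2 * shi(x)
    assert abs(lhs - rhs) < mpf('1e-25')
print("identity holds")
```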


The exponential integral has the well-known asymptotic series

$$ \mathrm{Ei}(x) \sim \frac{e^x}{x}\sum_{n=0}^\infty\frac{n!}{x^n}. $$

Obviously, this series does not converge (though do note the summands are, amusingly, just the summands of the Taylor series of $e^x$ reversed); it is an asymptotic expansion. This means the difference between the function and the partial sum of the first $N$ terms is asymptotic to the $(N+1)$st term. That is, we say $f(x)\sim\sum_{n=0}^\infty f_n(x)\,$ if $\,\bigl[f(x)-\sum_{n=0}^{N-1}f_n(x)\bigr]\sim f_N(x)$ for all $N$ (and recall $f\sim g$ means $\lim\limits_{x\to\infty} f(x)/g(x)=1$).

The asymptotic expansion is derived from the defining integral representation by repeated integration by parts, using the substitution $e^t\,\mathrm{d}t=\mathrm{d}(e^t)$ each time.
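The "error $\sim$ first omitted term" property can be seen numerically. A sketch with mpmath; the choices $x=30$ and $N=5$ are arbitrary:

```python
# Demonstrate the asymptotic character of Ei(x) ~ (e^x/x) * sum_n n!/x^n:
# the error after N terms is comparable to the (N+1)st term.
from mpmath import mp, mpf, ei, exp, factorial

mp.dps = 30
x, N = mpf(30), 5
partial = exp(x) / x * sum(factorial(n) / x**n for n in range(N))
err = ei(x) - partial
next_term = exp(x) / x * factorial(N) / x**N
ratio = err / next_term
print(ratio)  # approaches 1 as x grows; roughly 1 + (N+1)/x + ... here
```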

Second answer:

I shall assume that $\left| {\arg x} \right| \le \pi - \varepsilon < \pi$ and $ \left| {\arg \alpha } \right| \le 2\pi - \varepsilon < 2\pi$. In terms of the incomplete gamma function, $$ E_x ({\rm e}^{ \pm \pi {\rm i}} \alpha x) = ({\rm e}^{ \pm \pi {\rm i}} \alpha x)^{x - 1} \Gamma (1 - x,{\rm e}^{ \pm \pi {\rm i}} \alpha x) = x(\alpha x)^{x - 1} {\rm e}^{ \pm \pi {\rm i}x} \Gamma ( - x,{\rm e}^{ \pm \pi {\rm i}} \alpha x) - (\alpha x)^{ - 1} {\rm e}^{\alpha x} . $$

Then, by $(8.12.5)$, $$ \frac{{\Gamma (x + 1)}}{{2\pi {\rm i}}}{\rm e}^{ \pm \pi {\rm i}x} \Gamma ( - x,{\rm e}^{ \pm \pi {\rm i}} \alpha x) \sim \mp \frac{1}{2}\operatorname{erfc}\left( { \pm {\rm i}\beta \sqrt {x/2} } \right) + {\rm i}\frac{{{\rm e}^{\frac{1}{2}x\beta ^2 } }}{{\sqrt {2\pi x} }}\sum\limits_{k = 0}^\infty {\frac{{c_k (\beta )}}{{( - x)^k }}} \tag{$*$} $$ as $|x|\to+\infty$, uniformly with respect to $\alpha$.

Here $$ \beta = \sqrt {2(\alpha - 1 - \log \alpha )}, $$ and the branch of the square root is chosen so that $\beta (\alpha ) \sim \alpha - 1$ as $\alpha \to 1$.
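The first step above is just the recurrence $\Gamma(a+1,z)=a\Gamma(a,z)+z^a e^{-z}$ with $a=-x$, which in terms of $E_p$ reads $E_p(w) = -p\,w^{p-1}\Gamma(-p,w) + w^{-1}e^{-w}$. A sketch checking this with mpmath at a complex test point chosen off the negative real axis (to sidestep branch-cut ambiguities); the values of $p$ and $w$ are arbitrary:

```python
# Check the recurrence step: E_p(w) = -p * w^(p-1) * Gamma(-p, w) + e^{-w}/w.
from mpmath import mp, mpf, mpc, expint, gammainc, exp

mp.dps = 25
p = mpf('3.5')
w = mpc(-4, 1)  # near the negative axis, but off the branch cut
lhs = expint(p, w)
rhs = -p * w**(p - 1) * gammainc(-p, w) + exp(-w) / w
print(abs(lhs - rhs))
```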

If $\alpha$ is bounded away from $1$, then the expansion $(*)$ may be simplified by using $(7.12.1)$.