$O(\exp(\ln(x) \ln(\ln(x))^2)) = \sum_{n=0}^{\infty} a_n x^n$ and $0 < a_n$ asymptotics?


Consider for $x>e^e$ and $0<A<5$:

$$g_A(x) = \exp(\ln(x) \ln(\ln(x))^A)$$

I want to find a function $f_A(x)$,

$$f_A(x) = a_0(A) + a_1(A) x + a_2(A) x^2 + ... = \sum_{n=0}^{\infty} a_n(A) x^n$$

where for every $n>0$:

$$0 < a_{n+1}(A) < a_n(A)$$

and those $a_i$ are constants only depending on $A$, not on $x$.

And also

$$\lim_{x \to +\infty} \frac{g_A(x)}{f_A(x)} = 1$$

So basically $f_A(x)$ is a very good asymptotic approximation to $g_A(x)$, and we want the Maclaurin coefficients of $f_A(x)$ under the given restrictions.


Ok, maybe that is asking a lot.

Let us consider the simpler, less general case.

Let's say $A=2$; then

$$g(x) = \exp(\ln(x) \ln(\ln(x))^2)$$

$$f(x) = a_0 + a_1 x + a_2 x^2 + ... = \sum_{n=0}^{\infty} a_n x^n$$

where for every $n>0$:

$$0 < a_{n+1} < a_n$$

And

$$\lim_{x \to +\infty} \frac{g(x)}{f(x)} = 1$$

What are good approximations of $a_n$?

I know

$$h(x) = \sum_{n=0}^{\infty} h_n x^n $$

with $h_n = c_0 c_1^{- n^{c_2}}$ is close to $\exp(c_3 \ln(x)^{c_4})$ for appropriate positive constants $c_0,c_1,c_2,c_3,c_4$ (with $c_1 > 1$ and $c_2 > 1$).

Therefore $a_n < c_0 c_1^{- n^{c_2}}$ must hold for suitable such constants.
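As a sanity check of this correspondence, here is a small numerical sketch for the assumed sample values $c_0 = 1$, $c_1 = e$, $c_2 = 2$ (so $h_n = e^{-n^2}$); a standard saddle-point heuristic then predicts $c_4 = c_2/(c_2-1) = 2$ and $c_3 = 1/4$, and the ratio below should tend to $1$:

```python
import math

def log_h(logx, N=2000):
    # log of the partial sum  sum_{n=0}^{N} c1^{-n^{c2}} x^n
    # with the assumed sample values c0 = 1, c1 = e, c2 = 2,
    # computed with the log-sum-exp trick to avoid overflow
    logterms = [n * logx - n ** 2 for n in range(N + 1)]
    m = max(logterms)
    return m + math.log(sum(math.exp(t - m) for t in logterms))

# saddle-point prediction: log h(x) ~ (log x)^2 / 4, i.e. c3 = 1/4, c4 = 2
for logx in (10.0, 40.0, 100.0):
    print(logx, log_h(logx) / (logx ** 2 / 4))
```

The specific constants are assumptions chosen only to make the check concrete; any $c_1, c_2 > 1$ should behave the same way up to the values of $c_3, c_4$.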

So some guesses are

$$a_n = \exp(- \exp(n)) ?$$

$$a_n = \exp(- n!) ?$$

$$a_n = \exp(- \exp(\ln(n)^2)) ?$$
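The three guesses decay at very different rates; a quick numeric comparison of $\ln(1/a_n)$ for each (smaller means larger $a_n$, i.e. gentler decay):

```python
import math

# ln(1/a_n) for each guessed sequence; smaller value = slower decay
for n in (5, 10, 20):
    print(n,
          math.exp(n),                    # guess a_n = exp(-exp(n))
          math.factorial(n),              # guess a_n = exp(-n!)
          math.exp(math.log(n) ** 2))     # guess a_n = exp(-exp(ln(n)^2))
```

Note that $\exp(\ln(n)^2) = n^{\ln n}$ grows far more slowly than $\exp(n)$ or $n!$, so the third guess decays the most gently of the three.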

How can this be solved?


What I tried.

  1. Integral analogue

$$g(x) = \int_1^{+\infty} a(t) x^t \, dt$$

  2. Contour integral ideas

  3. The heuristic (known as "fake function theory" by some)

$$a_n < \min_{x>0} \frac{g^*(x)}{x^n}$$

where $g^*$ is the same as $g$ but with $\ln(x)$ replaced by an asymptotic equivalent that is well defined for all $x>0$, such as $\operatorname{arcsinh}(x/2)$.

Maybe I made a mistake, but none of these gave me sharp results.
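The heuristic-3 bound can at least be evaluated numerically. Here is a sketch for $A=2$, under the assumption that every $\ln$ in $g$ is replaced by $\operatorname{arcsinh}(\cdot/2)$ (which occurrences get replaced is my reading, not stated in the heuristic); the minimum is found by golden-section search on $t = \ln x$:

```python
import math

def log_gstar(x):
    # log g*(x) for A = 2, with each ln replaced by arcsinh(./2)
    # (one possible reading of the heuristic -- an assumption)
    L = math.asinh(x / 2.0)    # stands in for ln(x), defined for all x > 0
    LL = math.asinh(L / 2.0)   # stands in for ln(ln(x))
    return L * LL ** 2

def log_a(n, iters=200):
    # min over x > 0 of log( g*(x) / x^n ), via golden-section search
    # on t = ln x; the objective appears unimodal on this window
    f = lambda t: log_gstar(math.exp(t)) - n * t
    a, b = -10.0, 30.0         # search window for t = ln x (covers n <= 10)
    phi = (math.sqrt(5) - 1) / 2
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return f((a + b) / 2)

# upper bounds on log a_n from the heuristic; they decrease in n
for n in (1, 2, 5, 10):
    print(n, log_a(n))
```

This only produces the heuristic upper bound on each $a_n$, not the coefficients themselves.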


Edit:

I think

$$O(\exp(\exp(\ln(x) \ln(\ln(x))^2))) = \sum_{n=0}^{\infty} k_n x^n$$ and $$0 < k_n$$

holds for

$$\frac{1}{k_n} = O((n \ln(\ln(n))^l)^{m n \ln(\ln(n))^l})$$

and some real $l>1/4$ and real $m>0$.

That looks similar; maybe it could help, assuming it is true of course. Even so, it is probably not the most efficient way. Just sharing the idea.

There is 1 answer below.
Not much of an answer, but based on heuristic 3, "fake function theory" (described in the OP), there is the asymptotic formula:

Under the conditions (more or less; the theory is unfinished) that

$$ F(x) < \sqrt x ,$$

$F(x)$ is strictly increasing, and $F'(x)$ is strictly decreasing, we have

$$ x^{F(x)} = O\left( \sum \frac{x^n}{G(n-1)} \right)$$

where $G(x)$ is the functional inverse of $F(x)$ and $O(\cdot)$ is big-O notation.

This gives the following estimate here:

$$g_A(x) = x^{\ln(\ln(x))^A} = O( \sum \frac{x^n}{\exp(\exp((n-1)^{1/A}))} ) $$

I think I can push it a bit further, but this estimate is brutal.

A much better estimate is desired.
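To see numerically just how loose the estimate is, here is a quick check for $A=2$ (summing from $n=1$ so the exponent $(n-1)^{1/A}$ is defined): the series exceeds $g_A$ on a log scale by roughly a factor of two over this range:

```python
import math

A = 2

def log_g(logx):
    # log g_A(x) = ln(x) * ln(ln(x))^A
    return logx * math.log(logx) ** A

def log_series(logx, N=400):
    # log of sum_{n=1}^{N} x^n / exp(exp((n-1)^{1/A})),
    # computed with the log-sum-exp trick to avoid overflow
    terms = [n * logx - math.exp((n - 1) ** (1.0 / A)) for n in range(1, N + 1)]
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))

# the series dominates g_A on a log scale, so the O(.) bound holds
# numerically, but it is far from tight ("brutal")
for logx in (10.0, 20.0, 30.0):
    print(logx, log_g(logx), log_series(logx))
```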

See also this topic:

https://mathoverflow.net/questions/219428/asymptotics-to-taylor-expansions

or the same one here at MSE:

Proof that $\oint_r d(x,N + n) < 0 $?

which implies, when true, that for $x \gg 1$:

$$ \sum \frac{x^n}{\sqrt n \ln(e+n) \exp(\exp((n-1)^{1/A}))} < x^{\ln(\ln(x))^A} < \sum \frac{\sqrt n \ln(e+n)\, x^n}{\exp(\exp((n-1)^{1/A}))} $$

But I do not want to force a specific viewpoint onto the idea.

Maybe this is overkill, too complicated for the question considered, and simpler, more efficient ways exist.

Just my 2 cents.

By the way, $\min$ (minimum) here means $\inf$ (infimum), and we usually compute it by taking the derivative and setting it equal to zero.

See also this link, which may help clarify.

Post nr. 9 of "fake function theory":

https://tetrationforum.org/showthread.php?tid=863&pid=6995#pid6995

(Do not be confused by the semi-exponential here, i.e. $f(f(x)) = \exp(x)$; yes, it is a tetration forum, but the idea is very general, so it works in general cases.)