Minimum Argument Difference to Make the Lower Bound > the Upper Bound


Assume $g$ is a function that grows asymptotically as $$ g(n) \in \frac{n}{\log n} + O(\sqrt n), \quad n \in \Bbb N. \tag1 $$ I wish to find $h(n)$ such that $$ g(n) \le g(n+h(n)). $$ That is, given the bounds $$ \begin{align} u(n) &= \frac{n}{\log n} + \sqrt{n}, \tag 2 \\ l(n) &= \frac{n}{\log n} - \sqrt{n}, \tag 3 \end{align} $$ what $h(n)$ would give $$ u(n) \le l(n+h(n))? \tag 4 $$

Since $h(n)$ would simply be the difference between the inverse functions of $(2)$ and $(3)$, my approach so far has been to look for these inverses. The inverse of $f(n) = n/\log n$ is given by $$ f^{-1}(n) = e^{-W(-1/n)}, \tag 5 $$ which involves the multivalued Lambert $W$ function, whose principal branch for real arguments stops at $e$. The non-principal real branch can be written as the continued expression $$ f^{-1}(n) = n \log(n \log(n \log(\cdots))), \quad n > e, \tag 6 $$ but I haven't found a way to express the inverses of the bounding functions themselves. Maybe I need a Taylor series or some integral to approximate these?


Proof of $(6)$

The inverse $y = f^{-1}(x)$ of $f(x) = x/\log x$ satisfies $$ x = \frac{y}{\log y}. \tag 7 $$ Assume this inverse can be written as the continued expression $$ y = x \log(x \log(x \log(x \cdots))). \tag 8 $$ The expression inside the outermost $\log$ is $y$ itself, so we have $$ y = x \log y. \tag 9 $$ Dividing through by $\log y$ recovers $(7)$, confirming the original assumption (for $x > e$). Note that $(8)$ returns the upper-branch value (the region of interest), whereas $(5)$ returns only the principal value.
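The fixed-point reading of $(8)$ is easy to check numerically. Below is a minimal Python sketch (the function names are mine, purely illustrative) that evaluates the continued-log form by iterating $t \mapsto x\log t$ and verifies the result against $(7)$:

```python
import math

def f(t):
    """The function being inverted: f(t) = t / log(t)."""
    return t / math.log(t)

def f_inverse_upper(x, iterations=200):
    """Upper-branch inverse of f via the continued-log form (8).

    Iterating t <- x * log(t) from t = x converges to the fixed
    point t = x * log(t), i.e. equation (9), for x > e."""
    if x <= math.e:
        raise ValueError("the continued-log form is claimed only for x > e")
    t = x
    for _ in range(iterations):
        t = x * math.log(t)
    return t

y = f_inverse_upper(10.0)
print(y)     # upper-branch solution of t / log(t) = 10
print(f(y))  # recovers 10 (up to floating-point error)
```

The iteration converges because the map $t \mapsto x\log t$ has derivative $x/t = 1/\log t < 1$ at the upper fixed point.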


Addendum
Apparently, $(1)$ is weaker than I intended. How could it be made stronger, so that $(2)$ and $(3)$ are strict bounds? My guess is that replacing the $O(\sqrt n)$ term with something like $$ o\left(n^{\frac 1 2 - \epsilon}\right) \tag{10} $$ would eliminate the implied constant of the $O()$ notation, and reflect my intention that
$$ \forall n \in \Bbb N, \quad l(n) < g(n) < u(n) \tag{11} $$
and that $(4)$ holds with the ratio tending to $1$ from above: $$ \frac{l(n+h(n))}{u(n)} \to 1^+ \quad \text{as } n \to \infty. \tag{12} $$


It turns out that the optimal choice of $h(n)$ depends on the constant implicit in the $O(\sqrt{n})$ term, so redefine

$$ u(n) = \frac{n}{\log n} + c\sqrt{n} $$

and

$$ l(n) = \frac{n}{\log n} - c\sqrt{n}, $$

where $c > 0$ is fixed. Assuming $h = o(n)$, solving the equation

$$ u(n) = l(n+h) $$

asymptotically for $h$ yields

$$ h(n) \sim 2c\sqrt{n}\log n. \tag{1} $$

Further, it's possible to show that

$$ \lim_{n\to\infty} u(n) - l(n+2c\sqrt{n}\log n) = \infty \tag{2} $$

and that, for any $\epsilon > 0$,

$$ \lim_{n\to\infty} u(n) - l(n+(2+\epsilon)c\sqrt{n}\log n) = -\infty. \tag{3} $$

Consequently, if

$$ l(n) \leq g(n) \leq u(n) $$

for $n$ large enough then for any $\epsilon > 0$ there exists an $n_0(\epsilon)$ such that

$$ g(n) \leq g(n+(2+\epsilon)c\sqrt{n}\log n) \tag{4} $$

for $n \geq n_0(\epsilon)$.

(Note that we cannot take $\epsilon = 0$ in general, since any function $g(n)$ which oscillates between $l(n)$ and $u(n)$ will satisfy

$$ g(n) > g(n+2c\sqrt{n}\log n) $$

infinitely often. For a concrete example, take $g(n) = n/\log n + c\sqrt{n}\sin n$.)
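For a quick numerical illustration of $(2)$ and $(3)$ (a sketch only; the choices $c = 1$, $\epsilon = 1/2$, and $n = 10^8$ are arbitrary), one can evaluate $u(n) - l(n+h)$ directly for both choices of $h$:

```python
import math

c = 1.0  # stand-in for the constant in the O(sqrt(n)) term

def u(n):
    return n / math.log(n) + c * math.sqrt(n)

def l(n):
    return n / math.log(n) - c * math.sqrt(n)

n = 10**8
h_exact = 2 * c * math.sqrt(n) * math.log(n)           # h from (1)
h_bigger = (2 + 0.5) * c * math.sqrt(n) * math.log(n)  # (2 + eps) c sqrt(n) log n, eps = 1/2

print(u(n) - l(n + h_exact))   # still positive, consistent with (2)
print(u(n) - l(n + h_bigger))  # already negative, consistent with (3)
```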


Proof of $(1)$.

If we expand

$$ \begin{align} l(n+h) &= \frac{n+h}{\log(n+h)} - c\sqrt{n+h} \\ &= \frac{n+h}{\log n + \log(1 + h/n)} - c\sqrt{n+h} \\ &= \frac{n+h}{\log n} \frac{1}{1 + \frac{\log(1+h/n)}{\log n}} - c\sqrt{n}\sqrt{1+h/n} \\ &= \frac{n+h}{\log n} \left[1 + O\left(\frac{h}{n\log n}\right)\right] - c\sqrt{n} \left[1 + O\left(\frac{h}{n}\right)\right] \\ &= \frac{n+h}{\log n} + O\left(\frac{h}{(\log n)^2}\right) - c\sqrt{n} + O\left(\frac{h}{\sqrt{n}}\right), \end{align} $$

then the equation $l(n+h) - u(n) = 0$ becomes

$$ \frac{h}{\log n} + O\left(\frac{h}{(\log n)^2}\right) - 2c\sqrt{n} + O\left(\frac{h}{\sqrt{n}}\right) = 0. \tag{5} $$

A dominant balance is achieved if

$$ \frac{h}{\log n} \asymp 2c\sqrt{n}, $$

so it follows that $h \sim 2c\sqrt{n} \log n$.
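The dominant balance can also be probed by solving $u(n) = l(n+h)$ for $h$ numerically and comparing the root to $2c\sqrt{n}\log n$. A bisection sketch (again with the arbitrary choice $c = 1$):

```python
import math

c = 1.0  # stand-in for the constant in the O(sqrt(n)) term

def u(n):
    return n / math.log(n) + c * math.sqrt(n)

def l(n):
    return n / math.log(n) - c * math.sqrt(n)

def solve_h(n, tol=1e-6):
    """Bisection for the root h of l(n + h) = u(n).

    l is increasing for large arguments, so the root is unique once
    bracketed; h = n is a generous upper bracket for the n used here."""
    lo, hi = 0.0, float(n)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if l(n + mid) < u(n):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

n = 10**8
h = solve_h(n)
print(h / (2 * c * math.sqrt(n) * math.log(n)))  # close to 1; tends to 1 as n grows
```

The ratio exceeds $1$ slightly at finite $n$, reflecting the lower-order terms dropped in $(5)$.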


Proof of $(2)$ and $(3)$.

We know from $(5)$ that, for $h = o(n)$,

$$ u(n) - l(n+h) = 2c\sqrt{n} - \frac{h}{\log n} + O\left(\frac{h}{(\log n)^2}\right) + O\left(\frac{h}{\sqrt{n}}\right). $$

By taking $h = (2+\epsilon)c\sqrt{n}\log n$ with $\epsilon > 0$ we see that

$$ u(n) - l(n+h) \sim -\epsilon c \sqrt{n} \to -\infty, $$

which proves $(3)$.

Proving $(2)$ is only slightly more difficult; we need to expand the first term of $l(n+h)$ to one more order. I'll omit the calculations here and just state the result that, for $h = o(n)$,

$$ u(n) - l(n+h) = 2c\sqrt{n} - \frac{h}{\log n} + \frac{h}{(\log n)^2} + O\left(\frac{h^2}{n(\log n)^2}\right) + O\left(\frac{h}{\sqrt{n}}\right). $$

By taking $h = 2c\sqrt{n}\log n$ we see that

$$ u(n) - l(n+h) \sim \frac{2c\sqrt{n}}{\log n} \to \infty, $$

which proves $(2)$.
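The refined expansion predicts not just the sign but the size of the gap: with $h = 2c\sqrt{n}\log n$, the gap $u(n) - l(n+h)$ should behave like $2c\sqrt{n}/\log n$. A short sketch (arbitrary $c = 1$) comparing the two:

```python
import math

c = 1.0  # stand-in for the constant in the O(sqrt(n)) term

def u(n):
    return n / math.log(n) + c * math.sqrt(n)

def l(n):
    return n / math.log(n) - c * math.sqrt(n)

for n in (10**6, 10**9, 10**12):
    h = 2 * c * math.sqrt(n) * math.log(n)
    gap = u(n) - l(n + h)
    predicted = 2 * c * math.sqrt(n) / math.log(n)
    print(n, gap / predicted)  # ratio tends to 1 as n grows
```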