Help needed - Approximating functions with geometric integration and differentiation


I've somehow managed to approximate some functions using cheap tricks, such as geometrically differentiating the function and then geometrically integrating an easier equivalent of the derivative (see here for an example).

However, the proof I finally got in the example above worked 'mathematically' because I managed to get rid of the $o(1/x^{\text{whatever}})$ in my approximation of $x!$ by using $\frac{\mathrm{d}\ln(x!)}{\mathrm{d}x}=\frac{\mathrm{d}\ln(\Gamma(x+1))}{\mathrm{d}x}=\psi(x+1)$.

Had I used $x!=u(x)+o(1/x^{\text{whatever}})$, I would have had to take the $\frac{\mathrm{d}\ln(o(1/x^{\text{whatever}}))}{\mathrm{d}x}$ term into account, and I have absolutely no idea how to deal with that.

Therefore, I was wondering: when exactly can I use such a method to get approximations of functions?


To sum it up, my question is: when can we approximate a function by geometrically integrating an approximation of its geometric derivative?

Best Answer

Although your question is about integrating an estimate, your example is about taking derivatives. There is usually no problem integrating an estimate, although cancellation may give a better estimate. In any case, I will discuss the problem described in the question regarding derivatives.


Differentiating A Landau-style Estimate

The problem you are having arises because $o(\dots)$ controls size, but not smoothness. If you want to estimate a derivative, you need some information about smoothness.

For example, if $f(x)=\dfrac1x$, then as $x\to\infty$, $$ f(x)=O(1/x)\tag{1} $$ and $$ f'(x)=O(1/x^2)\tag{2} $$ However, if $g(x)=\dfrac{\sin(x^3)}{x}$, then as $x\to\infty$, $$ g(x)=O(1/x)\tag{3} $$ but $$ g'(x)=O(x)\tag{4} $$
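A quick numerical sketch (in Python, with hypothetical helper names) makes this contrast concrete. Both functions satisfy the same $O(1/x)$ bound, but their analytic derivatives, $f'(x)=-1/x^2$ and $g'(x)=3x\cos(x^3)-\sin(x^3)/x^2$, behave completely differently:

```python
import math

# f(x) = 1/x        -> f'(x) = -1/x^2                       (O(1/x^2))
# g(x) = sin(x^3)/x -> g'(x) = 3x*cos(x^3) - sin(x^3)/x^2   (O(x), not O(1/x^2))

def f_prime(x):
    return -1.0 / x**2

def g_prime(x):
    return 3.0 * x * math.cos(x**3) - math.sin(x**3) / x**2

for lo in [10, 100, 1000]:
    # sample |f'| and |g'| on a short window near x = lo
    xs = [lo + k * 0.01 for k in range(100)]
    max_f = max(abs(f_prime(x)) for x in xs)
    max_g = max(abs(g_prime(x)) for x in xs)
    print(f"near x={lo}: max|f'| = {max_f:.2e}, max|g'| = {max_g:.2e}")
```

As $x$ grows, $\max|f'|$ shrinks like $1/x^2$ while $\max|g'|$ grows roughly like $3x$, even though $f$ and $g$ themselves obey the same size estimate.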


Information, such as log-convexity, can help estimate derivatives. For example, since $$ x\Gamma(x)=\Gamma(x+1)\tag{5} $$ we get, by taking the log of $(5)$, $$ \log(x)+\log(\Gamma(x))=\log(\Gamma(x+1))\tag{6} $$ Knowing that $\Gamma(x)$ is log-convex tells us that the derivative of $\log(\Gamma(x))$ is increasing, so it is squeezed between the secant slopes on either side: $$ \hspace{-1cm}\small\log(x-1)=\frac{\log(\Gamma(x))-\log(\Gamma(x-1))}{x-(x-1)}\le\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))\le\frac{\log(\Gamma(x+1))-\log(\Gamma(x))}{(x+1)-x}=\log(x)\tag{7} $$ Thus, we can determine $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$ to within an error of $\log(x)-\log(x-1)\sim\frac1x$. That is, $$ \frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))=\log(x)+O(1/x)\tag{8} $$
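The bounds in $(7)$ and the estimate $(8)$ can be checked numerically. A minimal sketch, assuming Python's `math.lgamma` for $\log\Gamma$ and a central difference as a stand-in for the digamma function $\psi$:

```python
import math

def log_gamma_deriv(x, h=1e-5):
    """Central-difference estimate of d/dx log Gamma(x), i.e. psi(x)."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

for x in [5.0, 50.0, 500.0]:
    psi = log_gamma_deriv(x)
    lower, upper = math.log(x - 1), math.log(x)   # the two secant slopes from (7)
    print(f"x={x}: {lower:.6f} <= psi(x) = {psi:.6f} <= {upper:.6f}, "
          f"(log(x) - psi(x)) * x = {(upper - psi) * x:.4f}")
```

The last column stays bounded (near $1/2$, in fact), illustrating the $O(1/x)$ error term in $(8)$.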


Integrating A Landau-style Estimate

If $f(x)=O(g(x))$ (implying that $g(x)\gt0$), then $$ \int_0^xf(t)\,\mathrm{d}t=O\left(\int_0^xg(t)\,\mathrm{d}t\right)\tag{9} $$ However, we can often do better because of cancellation. For example, by integrating $\cos(x)=O(1)$, we get that $\sin(x)=O(x)$, which is true. However, because of cancellation between the positive and negative parts of $\cos(x)$, we actually get that $\sin(x)=O(1)$.
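The cancellation is easy to see numerically. A sketch using a simple midpoint-rule quadrature (the helper `integrate` is hypothetical, not from any library): the naive bound from $(9)$ grows like $x$, while the actual integral of $\cos$ stays bounded:

```python
import math

def integrate(func, a, b, n=10000):
    """Composite midpoint-rule approximation of the integral of func over [a, b]."""
    h = (b - a) / n
    return sum(func(a + (k + 0.5) * h) for k in range(n)) * h

for x in [10.0, 100.0, 1000.0]:
    integral = integrate(math.cos, 0.0, x)   # equals sin(x) up to quadrature error
    print(f"x={x}: naive bound O(x) = {x}, actual |integral| = {abs(integral):.4f}")
```

The integral of $|\cos|$ over $[0,x]$ really does grow like $x$, so the improvement to $O(1)$ comes entirely from the sign changes, not from the size of the integrand.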