Is there an easier way to find the "natural" integration constant?


Suppose we take consecutive derivatives of a function at a point and then interpolate them with a Newton series (the Newton interpolation formula), so as to obtain a smooth curve:

$$f^{(s)}(x)=\sum_{m=0}^{\infty} \binom {s}m \sum_{k=0}^m\binom mk(-1)^{m-k}f^{(k)}(x)$$

If the series converges at $s=-1$, we take this value to be the "natural" value of the antiderivative of $f$ at the point $x$ (treating the integral as the $(-1)$-th derivative).

For instance, for the function $f(x)=a^x$ the expansion converges (when it converges, which is not the case for all $a$) to $a^x (\ln a)^s$, or $(\ln a)^s$ at $x=0$. Thus the antiderivative of $a^x$ should naturally take the value $\frac{1}{\ln a}$ at $x=0$.
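As a quick numerical sanity check, the series at $s=-1$ can be summed directly for $a=2$, where it converges. The helper name and the truncation at 40 terms are my own choices (going much deeper runs into floating-point cancellation in the inner alternating sum):

```python
from math import comb, log

def newton_series_at_minus_one(derivs, terms=40):
    # Sum over m of binom(-1, m) * sum_k binom(m, k) (-1)^(m-k) derivs[k],
    # where binom(-1, m) = (-1)^m and derivs[k] = f^(k)(x).
    # Truncated at `terms`; large m suffers from float cancellation.
    total = 0.0
    for m in range(terms):
        inner = sum(comb(m, k) * (-1) ** (m - k) * derivs[k] for k in range(m + 1))
        total += (-1) ** m * inner
    return total

a = 2.0
derivs = [log(a) ** k for k in range(40)]  # f(x) = a^x at x = 0: f^(k)(0) = (ln a)^k
print(newton_series_at_minus_one(derivs))  # ≈ 1/ln(2) ≈ 1.4427
```

For $a^x$ the inner sum collapses to $(\ln a - 1)^m$, so the series is geometric with ratio $1-\ln a$ and converges exactly when $0<\ln a<2$.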

Is there an easier way to obtain this value, and possibly a more universal one (working even where the series diverges)?


BEST ANSWER

Well, using the exponential Fourier transform for non-periodic functions from this paper, one can derive at least one additional method:

$$f(x)=\frac1{2\pi}\int_{-\infty}^{+\infty} e^{i\omega x} \int_{-\infty}^{+\infty}f(t)e^{-i\omega t}dt \, d\omega $$

Integrating with respect to $x$ and applying the natural-integration rule for the exponential, $\int e^{i\omega x}\,dx=\frac{e^{i\omega x}}{i\omega}$, we get:

$$f^{(-1)}(x)=\frac1{2\pi}\int_{-\infty}^{+\infty} \frac{e^{i\omega x}}{i\omega} \int_{-\infty}^{+\infty}f(t)e^{-i\omega t}dt \, d\omega $$

Now, for $x=0$, we obtain (after substituting $\omega\to-\omega$ and using $\frac{1}{i}=-i$):

$$f^{(-1)}(0)=\frac{i}{2\pi}\int_{-\infty}^{+\infty} \frac{1}{\omega} \int_{-\infty}^{+\infty}f(t)e^{i\omega t}dt \, d\omega $$

Check if I am wrong.

Unfortunately, this method converges even more rarely. One function for which it works is $f(x)=x e^{-x^2}$; in this case $f^{(-1)}(0)=-\frac12$ (matching the antiderivative $-\frac12 e^{-x^2}$ at $0$). For $f(x)=e^{-x^2}$ the method gives $f^{(-1)}(0)=0$.
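The $xe^{-x^2}$ value can also be checked by brute-force quadrature. This is a rough sketch, not part of the method itself: the grids, cutoffs and helper names are ad hoc choices of mine, relying on the Gaussian decay of both integrands:

```python
import cmath, math

# Brute-force check of the double-integral formula for f(t) = t e^{-t^2}.
# Grid sizes and cutoffs are ad hoc, chosen so the Gaussian tails are negligible.
def f(t):
    return t * math.exp(-t * t)

def trapz(vals, dx):  # composite trapezoidal rule on a uniform grid
    return dx * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

ts = [(j - 800) * 0.01 for j in range(1601)]    # t in [-8, 8]
ws = [(j - 399.5) * 0.03 for j in range(800)]   # w symmetric, never exactly 0
fs = [f(t) for t in ts]

def g(w):  # inner transform: integral of f(t) e^{iwt} dt
    return trapz([fv * cmath.exp(1j * w * t) for fv, t in zip(fs, ts)], 0.01)

val = 1j / (2 * math.pi) * trapz([g(w) / w for w in ws], 0.03)
print(round(val.real, 2))  # ≈ -0.5, matching the antiderivative -e^{-t^2}/2 at 0
```

The $\omega$ grid is offset by half a step so that the $1/\omega$ factor is never evaluated at zero (the singularity there is removable for this $f$).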

Using tables of Fourier transforms, we can also get: for $f(x)=\sin x$, $f^{(-1)}(0)=-1$; for $f(x)=\cos x$, $f^{(-1)}(0)=0$; for $f(x)=(\sin x)^3$, $f^{(-1)}(0)=-\frac23$.

ANSWER

Here is an interesting way we can go about this. It can be shown (see this) that, if we replace $f(x)$ with a symbol $F$ and then interpret $F^n$ as $f(x+nh)$ (so $F$ acts as a shift operator), we have

$$f^{(n)}(x)=\lim _{h\rightarrow0}\left( \frac{F-1}h\right)^n$$

Notice the neat fact that $\lim\limits_{h\to0}\frac{F-1}{h}=f'(x)$. For example, expanding the above for $n=2$ gives the limit formula

$$f''(x)=\lim _{h\rightarrow0}\left( \frac{F-1}h\right)^2=\lim _{h\rightarrow0}\left( \frac{F^2-2F+1}{h^2}\right)=\lim _{h\rightarrow0}\left( \frac{f(x+2h)-2f(x+h)+f(x)}{h^2}\right)$$
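A quick numerical check of this limit formula (the function name is mine; pushing $h$ much smaller eventually loses to floating-point cancellation in the $h^2$ division):

```python
from math import sin

def second_derivative(f, x, h=1e-4):
    # forward-difference form coming from (F - 1)^2 / h^2
    return (f(x + 2 * h) - 2 * f(x + h) + f(x)) / h ** 2

print(second_derivative(sin, 1.0))  # ≈ -sin(1) ≈ -0.8415
```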

which is readily verified using L'Hôpital's rule. This has the disadvantage of sometimes evaluating derivatives when they don't exist, but it will always correctly evaluate the derivative when it exists. If we consider $f^{(-1)}(x)$ to be the integral of $f(x)$, then we have

$$f^{(-1)}(x)=\lim _{h\rightarrow0}\left( \frac{F-1}h\right)^{-1}=-\lim_{h\rightarrow0}h\cdot\left( \frac{1}{1-F}\right)=-\lim_{h\to0}h \sum_{k=0}^{\infty}F^k$$

Since the limit of a series is unique if it exists, we can define (using formal power series if necessary) the "natural" antiderivative of $f(x)$ as

$$f^{(-1)}(x)=-\lim_{h\to0}h \sum_{k=0}^\infty f(x+hk)$$

Unfortunately, the sum is not always well-defined (e.g. $f(x)=x$ obviously cannot be summed properly), but it does, for example, give $\int \cos(x)\,dx=\sin(x)$ and $\int e^x\,dx=e^x$.
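For a decaying function the sum converges outright, so the definition can be tested directly. Here is a minimal sketch (the helper name and parameters are my own) using $f(x)=e^{-x}$, whose antiderivative $-e^{-x}$ equals $-1$ at $x=0$:

```python
from math import exp

def natural_antiderivative(f, x, h=1e-3, terms=200_000):
    # -h * sum_{k>=0} f(x + h k); the series genuinely converges for decaying f
    return -h * sum(f(x + h * k) for k in range(terms))

print(natural_antiderivative(lambda u: exp(-u), 0.0))  # ≈ -1.0005 (→ -1 as h → 0)
```

For this $f$ the sum is geometric, $-h/(1-e^{-h})$ at $x=0$, which makes the $O(h)$ deviation from $-1$ visible in the printed value.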

One advantage of this derivation is that it is easily generalised to integrals and derivatives of any order. Using the fact that $(F-1)^{n}=\sum\limits_{k=0}^\infty{n\choose k}F^k(-1)^{n-k}$ (a finite sum when $n$ is a non-negative integer), we have

$$f^{(n)}(x)=\lim _{h\rightarrow0}\left( \frac{F-1}h\right)^n=\lim _{h\rightarrow0}\left( \frac{(F-1)^{n}}{h^n}\right)=\lim_{h\to0} h^{-n} \sum_{k=0}^\infty{n\choose k} (-1)^{n-k}f(x+hk)$$ is the "natural" differintegral for any $n$ where the above is defined (even non-integer $n$).
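For a non-negative integer $n$ the sum is finite and this is just the forward-difference quotient, which is easy to test numerically (a sketch with my own helper name and step size):

```python
from math import comb, exp

def forward_difference_derivative(f, x, n, h=1e-3):
    # h^(-n) * sum_{k=0..n} C(n,k) (-1)^(n-k) f(x + h k), for integer n
    return sum(comb(n, k) * (-1) ** (n - k) * f(x + h * k)
               for k in range(n + 1)) / h ** n

print(forward_difference_derivative(exp, 0.0, 3))  # ≈ 1.0015 (→ 1 as h → 0)
```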

EDIT: For $e^x$, $-\lim\limits_{h\to 0}h \sum\limits_{k=0}^\infty e^{x+hk}=-\lim\limits_{h\to 0}he^x \sum\limits_{k=0}^\infty e^{hk}=-\lim\limits_{h\to 0}\frac{he^x}{1-e^h}=e^x$. Note that the sum here is a formal power series: for $h>0$ it does not actually converge, and the most logical value to assign to it is the one obtained by extending the geometric series formula. For $h<0$ the series does converge, and the sign of $h$ cancels the resulting negative.

For $\sin(x)$ and $\cos(x)$, their integrals are the imaginary and real parts of $-\lim\limits_{h\to 0}h \sum\limits_{k=0}^\infty e^{i (x+hk)}$, respectively. This evaluates similarly:

$$-\lim\limits_{h\to 0}h \sum\limits_{k=0}^\infty e^{i (x+hk)}=-\lim\limits_{h\to 0}he^{ix} \sum\limits_{k=0}^\infty e^{ihk}=-e^{ix}\lim\limits_{h\to0}\frac{h}{1-e^{ih}}=-ie^{ix}=\sin(x)-i\cos(x)$$
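The limit can be confirmed numerically by plugging in a small $h$ (the values below are a sanity check, not a proof):

```python
import cmath
from math import sin, cos

# Evaluate -h e^{ix} / (1 - e^{ih}) at a small h, here at x = 1
h, x = 1e-6, 1.0
val = -h * cmath.exp(1j * x) / (1 - cmath.exp(1j * h))
print(val)  # ≈ sin(1) - i cos(1) ≈ 0.8415 - 0.5403i
```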