Derivative of a negative order?


Below, $\Delta$ means taking the derivative, $\frac{d}{dx}$. For $n\in\mathbb{Z}$, $n\geq 0$, we have $$\Delta^n\sin{x}=\sin{(x+n\tau/4)} \\ \Delta^n\cos{x}=\cos{(x+n\tau/4)}$$ I found that out while thinking about $\sin$, $\cos$ and $\Delta$. I understand what $\Delta$ does (or at least, I think I do), but I'm wondering if it works if you raise it to a negative power, and if you can do it then what does it even mean to raise it to a negative power?

I've heard that an integral is in some way the opposite to a differential, so maybe it would have something to do with integrals? I'd just like to know if you can raise $\Delta$ to a negative power and if so, what exactly it means to do so. I am not sure what tags would best suit this, so feel free to suggest some.


There are 3 best solutions below


Negative powers of a differential operator are possible and can be well defined. Your intuition is correct: it works out to integration. Plug negative values of $n$ into your formula, and you'll see that $\Delta^{-n}$ applied to sine or cosine gives an $n^\text{th}$ antiderivative of that sine or cosine. There are rigorous ways to establish this, for example via integral transforms such as the Laplace transform, or via operator theory.

To summarize: yes, you can take negative powers of differential operators, and doing so means applying the inverse operator of differentiation. Up to constants of integration, that inverse operator is integration.
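As a quick numerical sanity check (a Python sketch of my own, not part of the answer; `delta_n_sin` and `diff` are made-up helper names), differentiating $\sin(x + n\tau/4)$ with a finite difference recovers $\sin(x + (n+1)\tau/4)$ for negative $n$ just as for positive $n$:

```python
import math

TAU = 2 * math.pi  # tau = 2*pi, so tau/4 = pi/2

def delta_n_sin(x, n):
    # The formula from the question, extended to any integer n:
    # Delta^n sin(x) = sin(x + n*tau/4).
    return math.sin(x + n * TAU / 4)

def diff(f, x, h=1e-6):
    # Central finite-difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# Differentiating Delta^n sin should give Delta^(n+1) sin,
# for negative n just as for positive n.
for n in (-3, -2, -1, 0, 1, 2):
    for x in (0.3, 1.0, 2.5):
        approx = diff(lambda t: delta_n_sin(t, n), x)
        assert abs(approx - delta_n_sin(x, n + 1)) < 1e-6
```

Here $n = -1$ corresponds to the antiderivative $-\cos x = \sin(x - \tau/4)$, i.e. the particular choice of $+C$ that keeps the pattern going.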


Short answer: The operation $\Delta^{-1}$ you want is indeed integration, but there are some complications to constructing it as a well-defined operator.

Longer answer: For a continuous function $f$ on an interval $[a, b]\subset \mathbb{R}$, we have \begin{align*} \frac{d}{dx}\int^x_a f(t)\, dt = f(x) \end{align*} for $x\in [a, b]$. (In particular, the integral above is differentiable.) This is the fundamental theorem of calculus, and it's usually the main result in a first-year calculus class. As you imply in your question, though, we can think of differentiation as a linear operator $\Delta:C^\infty(X) \to C^\infty(X)$, where $C^\infty(X)$ denotes the space of smooth (i.e., infinitely differentiable) functions $f:X\to \mathbb{R}$ for some fixed compact interval $X\subset \mathbb{R}$. Does this map have an inverse? That is, is there some operator $\Delta^{-1}$ such that $\Delta \Delta^{-1} f = f$ and $\Delta^{-1} \Delta f = f$ for all $f\in C^\infty(X)$?

Well, no: $\Delta (1) = 0$, so $\Delta^{-1} \Delta(1) = \Delta^{-1}(0)$ must vanish by linearity, even though a true inverse would have to return the constant function $1$. On the other hand, we know that the only functions $f\in C^\infty(X)$ with $\Delta f = 0$ are the constants, so it turns out we can define $\Delta^{-1}$ up to a constant. The relation $\Delta \Delta^{-1} f = f$ still holds; the left $\Delta$ kills the arbitrary constant. On the other hand, we now have $\Delta^{-1} \Delta f = f + C$ for some arbitrary constant $C$. (Once you get to integration in your class, the teacher will inevitably nag you about adding the "$+C$" to indefinite integrals. That's the reason: antiderivatives are only defined up to an additive constant.)

Another complication is the fact that we're working in $C^\infty(X)$ above, even though we can still apply $\Delta$ to functions that are only differentiable once. We don't need differentiability to integrate; we can define $\Delta^{-1} f$ perfectly well for any continuous $f$ (and, in fact, we can reduce that assumption even further). If $f$ is continuous, then $\Delta^{-1} f$ is differentiable, and $\Delta \Delta^{-1} f = f$; the expression $\Delta^{-1} \Delta f$ is not defined if $f$ is merely continuous rather than differentiable, though. If you're familiar with linear algebra, the source of the complication here is that we're working over an infinite-dimensional space rather than a finite-dimensional one, so injectivity and surjectivity are not equivalent.
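The asymmetry between $\Delta \Delta^{-1}$ and $\Delta^{-1} \Delta$ is easy to see numerically. Here is a Python sketch (my own illustration, not from the answer; `antiderivative` builds $F(x) = \int_a^x f(t)\,dt$ with composite Simpson's rule, and each choice of base point $a$ corresponds to one choice of the arbitrary constant):

```python
import math

def antiderivative(f, a):
    # Delta^{-1} with base point a: F(x) = integral of f from a to x,
    # approximated by composite Simpson's rule.
    def F(x, n=1000):  # n must be even
        h = (x - a) / n
        s = f(a) + f(x)
        for i in range(1, n):
            s += (4 if i % 2 else 2) * f(a + i * h)
        return s * h / 3
    return F

def derivative(f, x, h=1e-5):
    # Central finite-difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

f, x = math.sin, 1.7

# Delta Delta^{-1} f = f, no matter which base point (constant) we pick:
for a in (0.0, 1.0, -2.0):
    assert abs(derivative(antiderivative(f, a), x) - f(x)) < 1e-4

# But Delta^{-1} Delta f = f + C; with base point a = 1.0, C = -f(1.0):
F = antiderivative(lambda t: derivative(f, t), 1.0)
assert abs(F(x) - (f(x) - f(1.0))) < 1e-4
```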


Generalized Differential for polynomials:

Given $$f\left(x\right)=\sum\limits_{\epsilon>0}^{1}\left(\sum\limits_{i\in\mathbb{Z}}^{}a_{i}x^{i+\epsilon}\right)$$ then $$f^{(k)}\left(x\right)=\sum\limits_{\epsilon>0}^{1}\left(\sum\limits_{i\in\mathbb{Z}}^{}\frac{\Pi\left(i+\epsilon\right)}{\Pi\left(i+\epsilon-k\right)}a_{i}x^{i+\epsilon-k}\right)$$

As anomaly noted, we need to adjust for constants of integration, and those occur exactly where $\Pi\left(n\right) = \Gamma\left(n+1\right)$ is undefined, i.e., at the poles of the Gamma function.
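For a single term $x^p$, the coefficient $\Pi(p)/\Pi(p-k)$ can be evaluated with the Gamma function. A minimal Python sketch (my own illustration; the helper names `Pi` and `d_k` are made up, only one term with $p > -1$ is handled, and the constant of integration is fixed to zero):

```python
import math

def Pi(n):
    # Pi(n) = Gamma(n + 1); math.gamma raises ValueError at its poles
    # (when n + 1 is zero or a negative integer), which is where the
    # constants of integration show up.
    return math.gamma(n + 1)

def d_k(p, k):
    # "kth derivative" of x^p: Pi(p)/Pi(p-k) * x^(p-k).
    # Negative k gives antiderivatives (with the +C fixed to 0).
    coeff = Pi(p) / Pi(p - k)
    return lambda x: coeff * x ** (p - k)

# k = 2: second derivative of x^3 is 6x, so at x = 2 we get 12.
assert abs(d_k(3, 2)(2.0) - 12.0) < 1e-9
# k = -1: an antiderivative of x^2 is x^3/3, so at x = 3 we get 9.
assert abs(d_k(2, -1)(3.0) - 9.0) < 1e-9
```

Trying `d_k(-1, -1)` raises a `ValueError` from `math.gamma` at its pole, which matches the remark above: the antiderivative of $x^{-1}$ is $\ln x$, which falls outside the power-rule pattern.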

Edit: Forgot the parentheses to denote a kth derivative.