Confusion about derivatives of distributions


Imagine $H(x)$ is the Heaviside unit step function (it's $1$ when $x>0$ and it's $0$ when $x<0$).

As we say, every function defines a distribution, so I can write:

$H[g] = \int_{-\infty}^{\infty} H(x)g(x)\,dx$ (I denote the distribution by the same letter, as most people do).

Now I differentiate this: $H'[g] = \int_{-\infty}^{\infty} H'(x)g(x)\,dx$.

Now, the problem I have is that I don't know how to prove that $H'[g] = -H[g']$. I can only get this equality by doing integration by parts on $H'[g]$, but the problem is that $H'(x)$ is not a function, so I can't do integration by parts.

I am not sure, but I think it's still said that even in this case $H'[g] = -H[g']$ holds, by definition. But why? The definition only seems justified when $H'(x)$ is a function. I hope I made sense.

I'm not learning distribution theory at a deep level. I need this to confirm to myself that I understand why $\int_{-\infty}^{\infty} H'(x)g(x)\,dx = g(0)$. Unless I can show that $H'[g] = -H[g']$ even in the case when $H'(x)$ is not a function in the classical sense, I'm really lost.

There are 2 best solutions below

Welcome to the Mathematics StackExchange, Giorgi!

I agree with commenters that, for any distribution $ f $, the definition of the derivative distribution $ f ' $ is that $ f ' [ g ] : = - f [ g ' ] $. The justification for this definition is that this is equal to the integral $ \int _ { - \infty } ^ \infty \, f ' ( x ) \, g ( x ) \, \mathrm d x $ (where the $ f $ here is a function that gives rise to the distribution) if $ f $ is a differentiable function. But if $ f $ is not a differentiable function (or if the distribution $ f $ doesn't come from a function at all), then this theorem doesn't hold, and that's what happens with the Heaviside step function $ H $ (which is not differentiable at $ 0 $).
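To see the definition in action for $H$ itself, here is a quick numerical sketch (assuming SciPy is available; the test function $g(x) = e^{-x^2}$ is my own choice, not from the answer). Since $H$ vanishes on $(-\infty, 0)$, the definition $H'[g] := -H[g']$ reduces to $-\int_0^\infty g'(x)\,\mathrm dx = g(0)$:

```python
import numpy as np
from scipy.integrate import quad

def g(x):
    return np.exp(-x**2)            # a smooth, rapidly decaying test function

def g_prime(x):
    return -2.0 * x * np.exp(-x**2)

# By definition, H'[g] := -H[g'] = -∫_{-∞}^∞ H(x) g'(x) dx = -∫_0^∞ g'(x) dx.
H_prime_of_g = -quad(g_prime, 0.0, np.inf)[0]

print(H_prime_of_g)                 # ≈ g(0) = 1
```

The point is that the right-hand side only ever integrates the ordinary function $H$ against $g'$, so no derivative of $H$ is needed.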

If you approximate $ H $ with a smooth approximation, say $$ H _ n ( x ) = \begin{cases} 0 & \text{for } x \leq - 1 / n , \\[1ex] \dfrac { 1 } { \exp \Bigl ( \dfrac { 4 n x } { n ^ 2 x ^ 2 - 1 } \Bigr ) + 1 } & \text{for } - 1 / n < x < 1 / n , \\[1ex] 1 & \text{for } x \geq 1 / n , \end{cases} $$ then you can approximate $ H ' $ with its derivative $$ H ' _ n ( x ) = \begin{cases} 0 & \text{for } x \leq - 1 / n , \\[1ex] \dfrac { n ( n ^ 2 x ^ 2 + 1 ) } { ( n ^ 2 x ^ 2 - 1 ) ^ 2 \cosh ^ 2 \Bigl ( \dfrac { 2 n x } { n ^ 2 x ^ 2 - 1 } \Bigr ) } & \text{for } - 1 / n < x < 1 / n , \\[1ex] 0 & \text{for } x \geq 1 / n . \end{cases} $$
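The convergence of these approximants can be checked numerically. The sketch below implements the $H'_n$ formula above (assuming SciPy; the test function $g(x)=e^{-x^2}$ and the overflow guard on $\cosh$ are my additions) and integrates it against $g$ for increasing $n$:

```python
import math
from scipy.integrate import quad

def Hn_prime(x, n):
    """Derivative of the smooth approximant H_n; nonzero only on (-1/n, 1/n)."""
    t = n * n * x * x - 1.0
    if t >= 0.0:
        return 0.0
    u = 2.0 * n * x / t
    if abs(u) > 350.0:              # cosh would overflow; the factor is effectively 0 here
        return 0.0
    sech2 = 1.0 / math.cosh(u) ** 2
    return n * (n * n * x * x + 1.0) / (t * t) * sech2

def g(x):
    return math.exp(-x ** 2)        # a smooth test function (my choice)

for n in (2, 10, 50):
    val = quad(lambda x: Hn_prime(x, n) * g(x), -1.0 / n, 1.0 / n)[0]
    print(n, val)                   # tends to g(0) = 1 as n grows
```

Each $H'_n$ integrates to exactly $1$ over its support, so as the bump narrows the integral picks out $g(0)$.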

If you pick a smooth, compactly supported function $ g $ and evaluate $ \lim \limits _ { n \to \infty } \int _ { - \infty } ^ \infty \, H _ n ( x ) \, g ( x ) \, \mathrm d x $, you'll get $ \int _ 0 ^ \infty \, g ( x ) \, \mathrm d x $, which is equal to the integral $ \int _ { - \infty } ^ \infty \, H ( x ) \, g ( x ) \, \mathrm d x $. You can also pretty much recover the function $ H $ from the sequence of functions $ H _ n $ since $ H ( x ) = \lim \limits _ { n \to \infty } H _ n ( x ) $, although this is only guaranteed to work almost everywhere. (Indeed, your definition of $ H $ left it undefined at $ 0 $; and although my sequence of smooth approximations to $ H $ gives $ \lim \limits _ { n \to \infty } H _ n ( 0 ) = \frac 1 2 $, it's possible to pick a different sequence of approximations to get a different value at $ 0 $. Or if you really twist things, even a different value anywhere else!)

If you turn to the derivatives and evaluate $ \lim \limits _ { n \to \infty } \int _ { - \infty } ^ \infty \, H ' _ n ( x ) \, g ( x ) \, \mathrm d x $, then you'll get $ g ( 0 ) $, which cannot be expressed by integrating any actual function against $ g $. If you try to find such a function $ \delta $ by defining $ \delta ( x ) : = \lim \limits _ { n \to \infty } H ' _ n ( x ) $, then you'll get $ \delta ( x ) = 0 $ whenever $ x \ne 0 $, but $ \delta ( 0 ) $ will diverge to $ \infty $. (This is also what you get if you try to find $ H ' ( 0 ) $ by picking any value between $ 0 $ and $ 1 $ for $ H ( 0 ) $ and working out $ H ' ( 0 ) $ as a limit.) So while the derivative of the distribution $ H $ makes sense as a distribution $ \delta $ given by $ \delta [ g ] = - H [ g ' ] = g ( 0 ) $, this distribution (unlike $ H $ itself) cannot be represented by any function. And since $ H $ cannot be represented by a differentiable function, this is no surprise.


The mother of all distributions with point support for discontinuities is $\text{abs}$, defined on $S$ as the linear map given by the conventional integral $$\text{abs}(f)=\int_{-\infty}^\infty \text{abs}(x)\, f(x)\,dx = \int_0^\infty x f(x)\,dx - \int_{-\infty}^0 x f(x)\,dx.$$ Its distributional derivative is $$\text{abs}'(f) =-\text{abs}(f') = - \int_0^\infty x f'(x)\,dx+\int_{-\infty}^0 x f'(x)\,dx= \int_0^\infty f(x)\,dx-\int_{-\infty}^0 f(x)\,dx = \int_{-\infty}^\infty \text{sign}(x)\, f(x)\,dx.$$
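The chain of equalities above can be verified numerically. A minimal sketch (assuming SciPy; the asymmetric test function $f(x)=e^{-(x-1)^2}$ is my choice, picked so that the sign integral is nonzero):

```python
import math
from scipy.integrate import quad

def f(x):
    return math.exp(-(x - 1.0) ** 2)        # asymmetric test function (my choice)

def f_prime(x):
    return -2.0 * (x - 1.0) * math.exp(-(x - 1.0) ** 2)

# abs'(f) := -abs(f') = -∫_0^∞ x f'(x) dx + ∫_{-∞}^0 x f'(x) dx
lhs = (-quad(lambda x: x * f_prime(x), 0.0, math.inf)[0]
       + quad(lambda x: x * f_prime(x), -math.inf, 0.0)[0])

# ∫_{-∞}^∞ sign(x) f(x) dx
rhs = quad(f, 0.0, math.inf)[0] - quad(f, -math.inf, 0.0)[0]

print(lhs, rhs)                             # the two sides agree
```

Here only ordinary integrals appear; the boundary terms from integration by parts vanish because $f$ decays at $\pm\infty$ and $x f(x)$ vanishes at $0$.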

The sign distribution is associated with the name of Heaviside; its one-sided half is the theta distribution, which can be seen as the distributional derivative of $\max(x,0)$: $$\theta_x(f)= \int_x^\infty f(t)\,dt.$$ Algorithmically, observe the renaming of the local integration variable inside the integral.

The derivative is $$\theta_x'(f) = -\int_x^\infty f'(t)\,dt = f(x) = \delta_x(f).$$
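This step is easy to confirm numerically (a sketch assuming SciPy; the test function $f(t)=e^{-t^2}$ and the evaluation point $x=0.5$ are my choices):

```python
import math
from scipy.integrate import quad

def f(t):
    return math.exp(-t ** 2)                # smooth test function (my choice)

def f_prime(t):
    return -2.0 * t * math.exp(-t ** 2)

x = 0.5                                     # arbitrary evaluation point
theta_prime = -quad(f_prime, x, math.inf)[0]   # θ_x'(f) = -∫_x^∞ f'(t) dt
print(theta_prime, f(x))                    # both ≈ e^{-0.25}
```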

Because the test-function space $S$ consists of infinitely differentiable functions whose derivatives of every order are all integrable over the real line, distributions are infinitely differentiable, too.

All distributions with point support can be evaluated by integration by parts and evaluation at interval boundaries, e.g.

$$\delta_x'(f) = - \delta_x(f') = \theta_x(f'') = \int_x^\infty f''(t)\,dt = -f'(x).$$
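The same numerical check works for this last identity (again assuming SciPy; $f(t)=e^{-t^2}$ and $x=0.5$ are my choices):

```python
import math
from scipy.integrate import quad

def f_prime(t):
    return -2.0 * t * math.exp(-t ** 2)

def f_second(t):
    return (4.0 * t ** 2 - 2.0) * math.exp(-t ** 2)

x = 0.5
delta_prime = quad(f_second, x, math.inf)[0]   # δ_x'(f) = ∫_x^∞ f''(t) dt
print(delta_prime, -f_prime(x))                # both ≈ e^{-0.25}
```

Pushing derivatives onto the test function in this way is exactly the mechanism that makes every distributional derivative well defined.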