Dirac delta of a function with zero derivative


It is known that:

$$\int_{-\infty}^\infty f(x) \, \delta(g(x)) \, dx = \sum_{i}\frac{f(x_i)}{|g'(x_i)|}$$

where $x_i$ are the roots of $g(x)$.
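As a quick sanity check of this sum rule, one can stand in for $\delta$ with a narrow Gaussian and integrate numerically. A minimal sketch (SciPy assumed; the width $\sigma$ and the test functions $f=e^x$, $g(x)=x^2-1$ are illustrative choices, not from the question):

```python
import numpy as np
from scipy.integrate import quad

def nascent_delta(t, sigma):
    """Gaussian approximation to the Dirac delta, of width sigma."""
    return np.exp(-t**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# g(x) = x^2 - 1 has simple roots at x = +/-1 with |g'(+/-1)| = 2,
# so the sum rule predicts (f(1) + f(-1)) / 2 for f = exp.
g = lambda x: x**2 - 1
sigma = 1e-2
approx, _ = quad(lambda x: np.exp(x) * nascent_delta(g(x), sigma),
                 -5, 5, points=[-1.0, 1.0], limit=500)
exact = (np.exp(1) + np.exp(-1)) / 2
```

As $\sigma \to 0$ the approximation converges to the sum-rule value; the `points` hint tells `quad` where the narrow peaks sit.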

My question is: what happens when $g'(x_i)$ is zero but $f(x_i)$ is not? It seems that the integral on the left should exist irrespective of the value of $g'(x)$, so:

Is there a different formula for the integral one should use in this case, or conversely, is this indeed an indicator that the integral diverges?

Bounty Edit

As user88595 has explained, this may be a consequence of $g(x)$ not having a simple root. I'm looking for a proof or a counterexample of the claim that, for any $g(x)$ which does not have a simple root at $x_i$, the integral diverges.

Edit:

I thought I'd give a concrete example. Let's first look at: $$\int_{-\pi/2}^{\pi/2}e^x \delta\left(\sin(x)\right)dx$$ Since $\sin x$ has only one zero in the interval, and the derivative there is one, the integral equals $e^0=1$. Now look at: $$\int_{-\pi/2}^{\pi/2}e^x \delta\left(\sin^3(x)\right)dx$$ which the above rule sends to infinity (since $g'(0) = 0$), and which indeed diverges. Is this the case in general?

There are 3 answers below.

Accepted answer:

The statement is true.

Let $g\colon\mathbb{R}\to\mathbb{R}$ be everywhere differentiable and $f\colon\mathbb{R}\to\mathbb{R}$ continuous. If $f(x_i)\neq 0$ and $g'(x_i)=0$ for some $x_i$ satisfying $g(x_i)=0$, then $\int_{-\infty}^{\infty} f(x) \delta(g(x)) dx $ diverges. The converse is true in a restricted sense: if $f(x_i)\neq 0$ and $g'(x_i)\neq 0$ for all $x_i$ satisfying $g(x_i)=0$, and there are only finitely many such points, then $\int_{-\infty}^{\infty} f(x) \delta(g(x)) dx $ is finite.

Since $\delta(g(x))$ is supported only where $g(x) = 0$, we can restrict the integral to small open neighborhoods of the roots $x_i$; label them $U_i$.

Consider first the case where $g$ is monotonic on $U_i$. Then $g$ has a local inverse $G_i$ with $G_i(g(x)) = x$ for $x\in U_i$. Since $g$ is continuous and monotonic on $U_i$, the image $V_i = g[U_i]$ is again an open set.

Then you can write $$\int_{-\infty}^{\infty} f(x) \delta(g(x)) dx =\sum_i \int_{U_i} f(x) \delta(g(x)) dx =\sum_i \int_{V_i} f(G_i(g)) \delta(g) \frac{dG_i}{dg} dg\,.$$ Writing $G'_i = \frac{dG_i}{dg}$, the definition of the delta function gives $$\sum_i \int_{V_i} f(G_i(g)) G'_i(g) \delta(g) dg = \sum_i f(G_i(0)) G'_i(0)\,.$$

Thus if $f(x_i) = f(G_i(0)) \neq 0$ then this is finite iff $G'_i(0)$ is finite. However $G'_i(0) = 1/g'(x_i)$ from the inverse function theorem, therefore the expression is finite exactly if $g'(x_i)\neq 0$.
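The inverse-function-theorem step can be sanity-checked symbolically; a minimal sketch with SymPy, using $g(x)=\sin x$ (so $x_i = 0$ and the local inverse is $\arcsin$):

```python
import sympy as sp

x, y = sp.symbols('x y')
g = sp.sin(x)        # monotonic near its root x_i = 0
G = sp.asin(y)       # local inverse: G(g(x)) = x near x = 0

# Inverse function theorem: G'(0) = 1 / g'(x_i).
Gp0 = sp.diff(G, y).subs(y, 0)   # G'(0)
gp0 = sp.diff(g, x).subs(x, 0)   # g'(0)
```

Here both derivatives come out to $1$, consistent with $G'_i(0) = 1/g'(x_i)$; if instead $g'(x_i) = 0$, the local inverse would have an infinite derivative at $0$.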

The case where $g$ has a local maximum or minimum at some $x_i$ is more subtle. In this case there is no local inverse, so the previous argument does not apply. Suppose $g$ is sufficiently many times differentiable that, if it has a minimum at $x_i$, then $g(x_i + z) \approx a z^{2n}$ in an infinitesimal neighborhood $U_i$, where $a > 0$ is real and $n\geq 1$ an integer (for a maximum, the same argument applies with $a < 0$). The delta function can be represented formally as the limit of the family $\phi_\sigma(x) = (2\pi \sigma^2)^{-1/2}\exp(-x^2/2\sigma^2)$ as $\sigma\rightarrow 0$. Then

$$\int_{U_i} f(x) \delta(g(x)) dx = \lim_{\sigma\rightarrow 0}\int_{U_i-x_i}\frac{1}{\sqrt{2\pi \sigma^2}} f(x_i + z)\exp(-a^2 z^{4n}/2\sigma^2) dz\,. $$

Since $f$ is continuous, in a sufficiently small neighborhood there is a $c > 0$ such that $f(x_i)+c>f(x)> f(x_i)-c$. Thus we can bound the integral from above and below by replacing $f(x_i+z)$ with $f(x_i)\pm c$, which can be taken out in front of the integral. Now, substituting $x = (\sigma/a)^{1/(2n)} y$, $$\frac{1}{\sqrt{2\pi \sigma^2}} \int_{-\infty}^{\infty} \exp(-a^2 x^{4n}/2\sigma^{2}) dx = a^{-1/(2n)}\,\sigma^{(1-2n)/(2n)}\times\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp(-y^{4n}/2) dy\,. $$ The second factor is finite, so the expression is proportional to $\sigma^{(1-2n)/(2n)}$, and since $(1-2n)/(2n) < 0$ for every $n \geq 1$, it goes to infinity in the limit $\sigma\rightarrow 0$. Since the lower bound of the integral goes to infinity, the integral is divergent.
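The scaling of the peak mass can be checked numerically; a sketch (SciPy assumed) for the quartic case $\exp(-x^4/2\sigma^2)$, the simplest degenerate root, where the mass should grow like $\sigma^{-1/2}$:

```python
import numpy as np
from scipy.integrate import quad

def peak_mass(sigma, p=4):
    """(2 pi sigma^2)^(-1/2) * int exp(-x**p / (2 sigma^2)) dx over [-1, 1]."""
    val, _ = quad(lambda x: np.exp(-x**p / (2 * sigma**2)),
                  -1, 1, points=[0.0], limit=200)
    return val / np.sqrt(2 * np.pi * sigma**2)

sigmas = np.array([1e-2, 1e-3, 1e-4])
masses = np.array([peak_mass(s) for s in sigmas])
# Fitted log-log slope; the scaling law predicts -1/2 for p = 4.
slope = np.polyfit(np.log(sigmas), np.log(masses), 1)[0]
```

The fitted slope comes out close to $-1/2$, so the nascent delta's mass at a non-simple root grows without bound as $\sigma \to 0$, matching the divergence argument above.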

Second answer:

Indeed this formula doesn't hold if $g'(x_i) = 0$. In fact the formula only holds if $g$ is a continuously differentiable function whose derivative is nonzero at every zero of $g$.

You could also say that $g$ has simple zeros, which implies that the function converges to zero linearly.

If $g'(x_i)$ is zero then $x_i$ is not a simple zero, and $g$ therefore converges to zero faster than linearly there.

My guess (although I can't prove it) is that the integral is finite because the delta converges to infinity "as quickly" as $dx$ converges to $0$, creating something finite. By this reasoning, if $g'(x_i) = 0$ then the integral will diverge at that point, as you said.

Please tell me if anything doesn't make sense. From my point of view the Dirac delta function is not well enough defined to argue rigorously, but then there might be things I don't know about it. It's a function from the dark arts.

Third answer:

You can tame these $\delta$'s by a simple substitution. Your first example:

$$\int_{-\pi/2}^{\pi/2}e^x \delta\left(\sin(x)\right)dx$$

Put $y=\sin x$, so $dy=\cos x dx$ and $dx = dy/\sqrt{1-y^2}$. The integral becomes:

$$\int_{-1}^{1}\frac{e^{\sin^{-1}y}}{\sqrt{1-y^2}} \delta(y)\,dy$$

This is a bit messy, but it doesn't matter, because it is just equal to the value of the integrand at $y=0$, which is $1$.
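As a sanity check, the transformed integrand really does evaluate to $1$ at $y=0$, and a narrow Gaussian standing in for $\delta(y)$ reproduces this (a sketch; SciPy assumed, $\sigma$ illustrative):

```python
import numpy as np
from scipy.integrate import quad

def h(y):
    """Transformed integrand e^{arcsin(y)} / sqrt(1 - y^2)."""
    return np.exp(np.arcsin(y)) / np.sqrt(1.0 - y**2)

# delta(y) just samples h at y = 0, where h(0) = e^0 / 1 = 1.
sample = h(0.0)

# Cross-check with a narrow Gaussian standing in for delta(y).
sigma = 1e-3
gauss = lambda y: np.exp(-y**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
approx, _ = quad(lambda y: h(y) * gauss(y), -0.5, 0.5, points=[0.0], limit=200)
```

The integration range stays away from $y = \pm 1$, where the $\sqrt{1-y^2}$ factor blows up; the delta only cares about $y = 0$ anyway.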

Your second example:

$$\int_{-\pi/2}^{\pi/2}e^x \delta\left(\sin^3(x)\right)dx$$

Put $y=\sin^3 x$, so $dy=3\sin^2 x \cos x dx$ and $dx = dy/(3\sin^2 x \cos x) = dy/\left(3y^{\frac23}\sqrt{1-y^{\frac23}}\right)$. The integral becomes:

$$\int_{-1}^{1}\frac{e^{\sin^{-1}(y^{\frac13})}}{3y^{\frac23}\sqrt{1-y^{\frac23}}} \delta(y)\,dy$$

But now the value of the integrand at $y=0$ is infinite, because of that $y^{\frac23}$ term. So the integral diverges.
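Numerically one can see the $y^{-2/3}$ blow-up directly (a sketch; NumPy assumed): the ratio of the integrand to the leading singular term $1/(3y^{2/3})$ tends to $1$ as $y\to 0^+$, while the integrand itself grows without bound.

```python
import numpy as np

def integrand(y):
    """e^{arcsin(y^{1/3})} / (3 y^{2/3} sqrt(1 - y^{2/3})) for y > 0."""
    c = np.cbrt(y)
    return np.exp(np.arcsin(c)) / (3 * c**2 * np.sqrt(1 - c**2))

ys = np.array([1e-3, 1e-6, 1e-9])
# Ratio to the leading singular term 1/(3 y^{2/3}): tends to 1 ...
ratios = integrand(ys) * 3 * np.cbrt(ys)**2
# ... while the integrand itself blows up as y -> 0+.
values = integrand(ys)
```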