Leibniz integral rule for distributions


I have the following expression

$$ \frac{d}{d\alpha} \int_{[0, 1]} f([x > \alpha] + c [x = a]) dx $$

for some smooth $f$. If I interchange the differentiation and the integration, I get

$$ \begin{align} \int_{[0, 1]} &\frac{d}{d\alpha} f([x > \alpha] + c [x = a]) dx \\ &= \int_{[0, 1]} f'([x > \alpha] + c [x = a]) \frac{d}{d\alpha} ([x > \alpha] + c [x = a]) dx \\ &= \int_{[0, 1]} f'([x > \alpha] + c [x = a]) \frac{d}{d\alpha} (H(x - \alpha) + c [x = a]) dx \\ &= -\int_{[0, 1]} f'([x > \alpha] + c [x = a]) \delta(x - \alpha) dx \\ &= - f'(c) \end{align} $$

So if the interchange were valid, the derivative would be the constant $-f'(c)$, i.e. $f'$ evaluated at the arbitrary point $c$, which does not look right.

So my questions are:

  1. What exactly is wrong in this line of reasoning?
  2. When I interchange the integration and the differentiation, the "functions" I'm dealing with inside the integral are no longer regular functions; they're generalized functions / distributions. Is there a theorem that would justify such a "Leibniz integral rule" for distributions?

    That is, the outer integral is a smooth function of $\alpha$, but I want to propagate the differentiation operator inside of the integral, deal with the distributions there, and then evaluate the integral to get back to regular functions.


A correct treatment

I will here assume that $a$ and $\alpha$ should be the same symbol, say $\alpha$.

First taking derivative, then integrating

First note that we can write $$ f_\alpha(x) := f([x > \alpha] + c [x = \alpha]) = \begin{cases} f(0) & x < \alpha \\ f(c) & x = \alpha \\ f(1) & x > \alpha \end{cases} $$ i.e. $$f_\alpha(x) = f(0) \, [x<\alpha] + f(c) \, [x=\alpha] + f(1) \, [x>\alpha]$$
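The equivalence of the Iverson-bracket form and the piecewise form can be checked mechanically. Below is a minimal sketch (not part of the original answer); the concrete choice of $f$ and the parameter values are arbitrary, made only so the code runs.

```python
def f(t):
    # Any smooth f will do; a concrete choice just for illustration.
    return t**2 + 1.0

def f_alpha_bracket(x, alpha, c):
    # f([x > alpha] + c [x = alpha]) with Iverson brackets as 0/1 indicators.
    return f(int(x > alpha) + c * int(x == alpha))

def f_alpha_piecewise(x, alpha, c):
    # The expanded piecewise form: f(0) for x < alpha, f(c) at x = alpha,
    # f(1) for x > alpha.
    if x < alpha:
        return f(0.0)
    elif x == alpha:
        return f(c)
    else:
        return f(1.0)

# The two forms agree pointwise:
for x in [-1.0, 0.25, 0.5, 0.75, 2.0]:
    assert f_alpha_bracket(x, 0.5, 0.3) == f_alpha_piecewise(x, 0.5, 0.3)
```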

Taking the derivative with respect to $\alpha$ gives $$ \frac{\partial}{\partial\alpha} f_\alpha(x) = f(0) \, \delta(\alpha-x) + 0 + f(1) \, \left(-\delta(\alpha-x)\right) = \left( f(0) - f(1) \right) \delta(\alpha-x)$$ since a value at a single point doesn't matter for integration (e.g. when a function is treated as a distribution).

Integrating gives $$ \int_{[0,1]} \frac{\partial}{\partial\alpha} f_\alpha(x) \, dx = \int_{[0,1]} \left( f(0) - f(1) \right) \delta(\alpha-x) \, dx = \begin{cases} 0 & \alpha < 0 \\ f(0) - f(1) & 0 \leq \alpha \leq 1 \\ 0 & \alpha > 1 \end{cases} $$ although one should be careful integrating $\delta(\alpha-x)$ for $\alpha \in \partial[0, 1]$, i.e. for $\alpha=0$ and $\alpha=1$. At those points the values should rather be considered as undefined.
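The distributional derivative $\left( f(0) - f(1) \right) \delta(\alpha-x)$ can be sanity-checked numerically by pairing with a test function: moving the $\alpha$-derivative onto a smooth $\varphi$ gives $\langle \partial_\alpha f_\alpha(x), \varphi \rangle = -\int f_\alpha(x)\, \varphi'(\alpha)\, d\alpha$, which should equal $(f(0)-f(1))\,\varphi(x)$. The sketch below (not from the original answer; $f$, $\varphi$, and the quadrature are arbitrary illustrative choices) verifies this with a crude midpoint rule.

```python
import math

def f(t):
    # Any smooth f; a concrete choice for the check.
    return math.sin(t) + 2.0

def f_alpha(x, alpha):
    # f_alpha(x) as a function of alpha for fixed x; the value at the
    # single point alpha == x is irrelevant for integration.
    return f(0.0) if x < alpha else f(1.0)

def phi(a):
    # Smooth, rapidly decaying test function.
    return math.exp(-a * a)

def dphi(a):
    return -2.0 * a * math.exp(-a * a)

def pairing(x, lo=-10.0, hi=10.0, n=200000):
    # <d/dalpha f_alpha(x), phi> = -integral of f_alpha(x) * phi'(alpha),
    # computed with a midpoint rule.
    h = (hi - lo) / n
    return -sum(f_alpha(x, lo + (k + 0.5) * h) * dphi(lo + (k + 0.5) * h)
                for k in range(n)) * h

# Expected: (f(0) - f(1)) * phi(x)
x = 0.3
assert abs(pairing(x) - (f(0.0) - f(1.0)) * phi(x)) < 1e-3
```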

First integrating, then taking derivative

Since integration doesn't care about values at isolated points, $$ I(\alpha) := \int_{[0, 1]} f([x > \alpha] + c [x = \alpha]) \, dx = \int_{[0, 1]} f([x > \alpha]) \, dx $$

Expanding the last expression we get $$\begin{align} I(\alpha) & = \int_{[0, 1]} \left( f(0) [x \leq \alpha] + f(1) [x > \alpha] \right) \, dx \\ & = \int \left( f(0) [x \leq \alpha][0 \leq x \leq 1] + f(1) [x > \alpha][0 \leq x \leq 1] \right) \, dx \\ & = f(0) \, m\left( (-\infty, \alpha] \cap [0, 1] \right) + f(1) \, m\left( (\alpha, \infty) \cap [0, 1] \right) \end{align}$$ where $m(A)$ is the Lebesgue measure (i.e. length) of $A$.

If $\alpha < 0$ then $(-\infty, \alpha] \cap [0, 1] = \emptyset$ and $(\alpha, \infty) \cap [0, 1] = [0, 1]$ so $I(\alpha) = f(1).$

If $\alpha > 1$ then $(-\infty, \alpha] \cap [0, 1] = [0, 1]$ and $(\alpha, \infty) \cap [0, 1] = \emptyset$ so $I(\alpha) = f(0).$

If $0 \leq \alpha \leq 1$ then $(-\infty, \alpha] \cap [0, 1] = [0, \alpha]$ and $(\alpha, \infty) \cap [0, 1] = (\alpha, 1]$ so $I(\alpha) = f(0) \, \alpha + f(1) \, (1-\alpha).$

Summarizing, this can be written $$I(\alpha) = \begin{cases} f(1) & \alpha < 0 \\ f(0) \, \alpha + f(1) \, (1-\alpha) & 0 < \alpha < 1 \\ f(0) & \alpha > 1 \end{cases}$$ so $$I'(\alpha) = \begin{cases} 0 & \alpha < 0 \\ f(0) - f(1) & 0 < \alpha < 1 \\ 0 & \alpha > 1 \end{cases}$$ with undefined values for $\alpha = 0$ and $\alpha = 1$.
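Both the closed form of $I(\alpha)$ and its slope on $(0,1)$ are easy to confirm numerically. The following sketch (an illustration, not the answer's own code; $f$ and the step sizes are arbitrary choices) integrates $f([x > \alpha])$ with a midpoint rule and compares against $f(0)\,\alpha + f(1)\,(1-\alpha)$ and $I'(\alpha) = f(0) - f(1)$.

```python
import math

def f(t):
    # Any smooth f; a concrete choice for the check.
    return math.cos(t) + 3.0

def I(alpha, n=100000):
    # Midpoint quadrature of the integral of f([x > alpha]) over [0, 1].
    h = 1.0 / n
    return sum(f(1.0 if (k + 0.5) * h > alpha else 0.0) for k in range(n)) * h

alpha = 0.4
# Closed form for 0 < alpha < 1:  I(alpha) = f(0)*alpha + f(1)*(1 - alpha)
assert abs(I(alpha) - (f(0.0) * alpha + f(1.0) * (1.0 - alpha))) < 1e-4

# Slope check via central difference: I'(alpha) = f(0) - f(1) on (0, 1)
d = (I(alpha + 1e-3) - I(alpha - 1e-3)) / 2e-3
assert abs(d - (f(0.0) - f(1.0))) < 1e-2
```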


The incorrect treatment

Let us study a simpler case. Given $f$, let $$g(x) = f(H(x)) = \begin{cases}f(0) & x<0 \\ f(1) & x>0\end{cases}$$

The correct derivative as a distribution is $$g'(x) = \left( f(1) - f(0) \right) \, \delta(x)$$

The chain rule formally gives $$g'(x) = f'(H(x)) \, H'(x) = f'(H(x)) \, \delta(x)$$ Here $$f'(H(x)) = \begin{cases}f'(0) & x<0 \\ f'(1) & x>0 \end{cases}$$ Thus we have a step function multiplied by the Dirac distribution, and both are singular at the same point, so the product is not well-defined.

Also, the chain rule says that $g'(x) = f'(H(x)) \, H'(x)$ for those $x$ where both $f'(H(x))$ and $H'(x)$ are defined as derivatives of functions. This condition is valid for $x \neq 0$, where we correctly get $g'(x) = 0$. At $x = 0$ we cannot use the chain rule.
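The correct distributional derivative $g'(x) = (f(1) - f(0))\,\delta(x)$ can also be checked by pairing with a test function: $\langle g', \varphi \rangle = -\int g(x)\,\varphi'(x)\,dx$ should equal $(f(1)-f(0))\,\varphi(0)$. The sketch below is an illustrative numerical check (not from the original answer); $f$, $\varphi$, and the quadrature are arbitrary choices.

```python
import math

def f(t):
    # Any smooth f; a concrete choice for the check.
    return t**3 + 2.0

def g(x):
    # g(x) = f(H(x)); the value at x = 0 is irrelevant for integration.
    return f(1.0) if x > 0 else f(0.0)

def phi(x):
    # Smooth, rapidly decaying test function.
    return math.exp(-x * x)

def dphi(x):
    return -2.0 * x * math.exp(-x * x)

def pair_gprime(lo=-10.0, hi=10.0, n=200000):
    # <g', phi> = -integral of g(x) * phi'(x), via a midpoint rule.
    h = (hi - lo) / n
    return -sum(g(lo + (k + 0.5) * h) * dphi(lo + (k + 0.5) * h)
                for k in range(n)) * h

# Expected distributional derivative: <g', phi> = (f(1) - f(0)) * phi(0)
assert abs(pair_gprime() - (f(1.0) - f(0.0)) * phi(0.0)) < 1e-3
```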