I've run into a paradox. Let $f : I \rightarrow \mathbb{R}$ be a continuous function, where $I$ is an interval of $\mathbb{R}$. We know that
$F(x) = \int_{a}^{x} f(t) \, \mathrm{d}t$ satisfies $F'(x) = f(x)$ for all $x, a \in I$. But we can also write:
$F(x) = \int_{\mathbb{R}} 1_{[a;x]}(t) \, f(t) \, \mathrm{d}\lambda(t)$ for all $x \in I$. Now, applying the Leibniz integral rule (differentiation under the integral sign), we have:
- for all $x \in I$, the map $t \mapsto 1_{[a;x]}(t) \, f(t)$ is Lebesgue-integrable, because $f$ is continuous, hence bounded, on $[a;x]$;
- for each fixed $x$, the derivative with respect to $x$ of $x \mapsto 1_{[a;x]}(t) \, f(t)$ exists for almost all $t \in \mathbb{R}$ and equals $0$: for every $t \neq x$, the indicator $1_{[a;x]}(t)$ is locally constant as a function of $x$, whether $t$ lies in $[a;x[$ or in $]-\infty;a[ \, \cup \, ]x;+\infty[$. (I suppose $x > a$; the case $x < a$ is analogous.)
So we should have:
$F'(x) = \int_{\mathbb{R}} 0 \, \mathrm{d}\lambda(t) = 0$, which is wrong, since we know that $F'(x) = f(x)$. Where is the mistake?
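Just to be concrete, here is a quick numerical sanity check (my own sketch, with the arbitrary choices $f(t) = t^2$ and $a = 0$) confirming that $F'(x) = f(x)$ rather than $0$:

```python
def f(t):
    # Example integrand, continuous on R (my choice for illustration)
    return t * t

def F(x, a=0.0, n=100_000):
    # F(x) = integral of f from a to x, via a simple trapezoid sum
    h = (x - a) / n
    s = 0.5 * (f(a) + f(x))
    for k in range(1, n):
        s += f(a + k * h)
    return s * h

x = 1.5
h = 1e-4
# Central-difference approximation of F'(x)
deriv = (F(x + h) - F(x - h)) / (2 * h)
print(deriv, f(x))  # both close to 2.25, clearly not 0
```

So numerically the derivative matches $f(x)$, which means one of the hypotheses of the Leibniz rule must fail here.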
Thank you !