Derivative of the test function.


For any real number $T$, consider the special function $f(t)=-1$ when $t\in[0,T]$ and $f(t)=0$ when $t\not\in[0,T]$. My book says $f'(t)=\delta(t-T)-\delta(t)$, but I don't know why. In my opinion, it should be $0$, since $f$ is constant on each interval. How can I calculate it?

EDIT: I roughly feel it, since the definition of the derivative says the slope approaches $-\infty$ at $0$ and $+\infty$ at $T$, intuitively. But I can't prove it or make it precise.

Best answer:

Given a distribution $u$, its derivative $u'$ is defined by $$ \langle u', \varphi \rangle = - \langle u, \varphi' \rangle $$ for all test functions $\varphi.$
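As a quick illustration of this definition (not part of the question itself): for the Heaviside step function $H$, with $H(t)=0$ for $t<0$ and $H(t)=1$ for $t>0$, one gets $$ \langle H', \varphi \rangle = -\int_0^{\infty} \varphi'(t)\,dt = \varphi(0) = \langle \delta, \varphi \rangle, $$ using that test functions vanish at infinity. So $H'=\delta$ in the sense of distributions, even though $H$ is constant away from $0$.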

Therefore, take a test function $\varphi$ and rewrite $$ - \int_{-\infty}^{\infty} f(t) \, \varphi'(t) \, dt $$ so that only $\varphi$ appears, with no $\varphi'.$ It is useful to insert the definition of $f$ and then apply the fundamental theorem of calculus.

Give it a try!
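For completeness, a sketch of how the computation goes (spelling out the hint above): inserting the definition of $f$,

$$ \langle f', \varphi \rangle = -\int_{-\infty}^{\infty} f(t)\,\varphi'(t)\,dt = \int_0^T \varphi'(t)\,dt = \varphi(T)-\varphi(0) = \langle \delta(t-T)-\delta(t), \varphi \rangle, $$

which is exactly the book's formula $f'(t) = \delta(t-T) - \delta(t)$. This can also be sanity-checked numerically (a rough sketch, not part of the original answer; the Gaussian below merely stands in for a compactly supported test function):

```python
import numpy as np

# Numerical sanity check: approximate <f', phi> = -∫ f(t) phi'(t) dt
# on a grid and compare it with phi(T) - phi(0), which is what
# <delta(t - T) - delta(t), phi> evaluates to.

T = 2.0
t = np.linspace(-5.0, 5.0, 200_001)
dt = t[1] - t[0]

# f(t) = -1 on [0, T], 0 elsewhere
f = np.where((t >= 0.0) & (t <= T), -1.0, 0.0)

# A smooth, rapidly decaying stand-in for a test function.
phi = np.exp(-t**2)
dphi = np.gradient(phi, dt)

# Trapezoidal rule for  lhs = -∫ f(t) phi'(t) dt  (the definition of <f', phi>)
g = f * dphi
lhs = -np.sum(0.5 * (g[1:] + g[:-1])) * dt

# rhs = phi(T) - phi(0)
rhs = np.exp(-T**2) - 1.0

print(abs(lhs - rhs) < 1e-3)  # prints True: the two sides agree
```

The discrepancy comes only from discretization error at the jump points, and shrinks as the grid is refined.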