ODEs in the space of distributions $\mathcal{D}'(a,b)$: is there a general solution technique?


I am studying ordinary differential equations in the space of distributions $\mathcal{D}'$, where $\mathcal{D}$ is the space of bump (a.k.a. test) functions.
The equations we treat are linear, although not always with constant coefficients. What my instructor does to solve these equations is the following:

  1. In the simplest cases, resorting directly to the lemma which states that $T'=0$ in $\mathcal{D}'$ implies $T = \text{constant}$.
  2. In less trivial cases, such as $u'+a(x)u=0$ with $a(x)\in C^\infty$, she exploits 'suggestions' from the classical solution: given some $\phi\in\mathcal{D}$, she lets the derivative act on the product $e^{-A(x)}\phi$ instead of on $\phi$ alone, where $A(x)$ is a primitive of $a(x)$; hence $\phi$ is multiplied by the classical solution of the equation. This leads to a convenient rewriting of the equation which allows one to apply the lemma of point 1.
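Concretely, as far as I can reconstruct the step: since $e^{A}\,(e^{-A}\phi)' = \phi' - a\phi$, the equation $u'+a(x)u=0$ says that for every $\phi\in\mathcal{D}$

$$0 = \langle u' + au, \phi \rangle = \langle u, -\phi' + a\phi \rangle = -\langle e^{A}u, (e^{-A}\phi)' \rangle = \langle (e^{A}u)', e^{-A}\phi \rangle,$$

and since $\phi \mapsto e^{-A}\phi$ maps $\mathcal{D}$ onto itself, this means $(e^{A}u)'=0$; the lemma of point 1 then gives $e^{A}u = \text{constant}$, i.e. $u = Ce^{-A}$.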

I can see the idea behind the technique, but since this is done without justification, I would like to know whether it is just a heuristic or a general approach. We moved on fairly soon to partial differential equations, where, through the fundamental solution of an operator and convolution, one does get a general approach; so any help, suggestion or explanation would be great.
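For comparison, here is my understanding of how the two viewpoints meet in the constant-coefficient case $u' + au = v$ (with $H$ the Heaviside function): the fundamental solution is

$$E(x) = e^{-ax}H(x), \qquad E' + aE = \delta,$$

and for, say, compactly supported $v$, the distribution $u = E * v$ solves the equation, since $u' + au = (E' + aE) * v = \delta * v = v$.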

Best answer:

Of course these techniques have to be justified.

Assume that we have a distributional differential equation $u' + A'u = v,$ where $u$ is an unknown distribution, $A$ is a given $C^\infty$ function, and $v$ is a given distribution.

From the classical theory, we get the idea of multiplying $u' + A'u$ by the integrating factor $e^A$. This step causes no problem: the identity $e^A (u' + A'u) = (e^A u)'$ is valid in distribution theory as well.
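For completeness, here is the routine pairing computation behind that identity (not spelled out in the original answer): for every $\varphi \in C_c^\infty$,

$$\langle (e^A u)', \varphi \rangle = -\langle u, e^A \varphi' \rangle = -\langle u, (e^A\varphi)' - A'e^A\varphi \rangle = \langle u', e^A\varphi \rangle + \langle A'u, e^A\varphi \rangle = \langle e^A(u' + A'u), \varphi \rangle,$$

using $(e^A\varphi)' = e^A\varphi' + A'e^A\varphi$ and the definition of the distributional derivative.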

But will multiplication by $e^A$ preserve the set of solutions? Is $e^A w = 0$ equivalent to $w = 0$ for every distribution $w$? Indeed it is, and the reason is that $e^{-A} \in C^\infty$, since $e^A$ is nonzero everywhere:

Let $\varphi \in C_c^\infty$. If $w=0$ then it is clear that $e^A w=0$ since $\langle e^A w, \varphi \rangle = \langle w, e^A \varphi \rangle = 0$. On the other hand, if $e^A w = 0$ then $\langle w, \varphi \rangle = \langle w, e^A e^{-A} \varphi \rangle = \langle e^A w, e^{-A} \varphi \rangle = 0,$ since $e^{-A} \varphi \in C_c^\infty$.

Thus, multiplying $u'+A'u=v$ by $e^A$ doesn't change the set of solutions, so $e^A(u'+A'u)=e^A v$ is equivalent to the initial equation, which in turn is equivalent to $(e^A u)' = e^A v$. Now we only have to find all primitive distributions $w$ of $e^A v$ to get $e^A u = w$, and then multiply by the nonzero factor $e^{-A}$ to get $u = e^{-A} w$.
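As an illustration (my addition, not part of the answer above): take $A(x) = x$ and $v = \delta$. The recipe gives $(e^x u)' = e^x\delta = \delta$, so $e^x u = H + C$ and $u = e^{-x}(H + C)$. The sketch below numerically sanity-checks the particular solution $u = e^{-x}H(x)$, i.e. that $-\langle u, \varphi' \rangle + \langle u, \varphi \rangle = \varphi(0)$ for a bump test function $\varphi$:

```python
import math

def phi(x, a=1.0):
    """Bump test function exp(-1/(a^2 - x^2)) supported on (-a, a)."""
    return math.exp(-1.0 / (a * a - x * x)) if abs(x) < a else 0.0

def dphi(x, h=1e-6):
    """Central-difference approximation of phi'(x)."""
    return (phi(x + h) - phi(x - h)) / (2.0 * h)

def u(x):
    """Candidate particular solution u(x) = e^{-x} H(x) of u' + u = delta."""
    return math.exp(-x) if x >= 0 else 0.0

def integrate(f, lo, hi, n=20000):
    """Composite trapezoid rule on [lo, hi]."""
    dx = (hi - lo) / n
    s = 0.5 * (f(lo) + f(hi))
    for i in range(1, n):
        s += f(lo + i * dx)
    return s * dx

# <u' + u, phi> = -<u, phi'> + <u, phi>; the integrand vanishes outside [0, 1)
lhs = (-integrate(lambda x: u(x) * dphi(x), 0.0, 1.0)
       + integrate(lambda x: u(x) * phi(x), 0.0, 1.0))
rhs = phi(0.0)  # <delta, phi>
print(lhs, rhs)
```

Integration by parts on $[0,1]$ confirms the check analytically: $-\int_0^1 e^{-x}\varphi'\,dx + \int_0^1 e^{-x}\varphi\,dx = \varphi(0) - e^{-1}\varphi(1) = \varphi(0)$, since $\varphi$ vanishes at $1$.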