I came across the following exercise:
Suppose that $\Lambda \in D' (A)$, where $A$ is a real open interval, satisfies the differential equation
$\Lambda' + g \Lambda = f$
with $g \in \mathcal{C}^{\infty}(A)$. Show that if $f \in \mathcal{C}(A)$ then $\Lambda$ is a $\mathcal{C}^{1}$ function.
From what I understand, saying that a distribution $\Lambda$ is a function means that there exists a function $u$ such that $\Lambda (\phi) = \int u \cdot \phi$ for all test functions $\phi$.
So in the given case the differential equation reads
$\Lambda' (\phi) + (g \Lambda) (\phi) = \int_{A} f \cdot \phi,$
which I can rewrite as
$\Lambda (g \phi - \phi') = \int_{A} f \cdot \phi$
Here I'm totally lost and have no idea how to proceed.
I'll preface this by saying that I know no distribution theory and googled all the results I'm using, but hopefully there are no mistakes, and maybe there's a more efficient way of doing this.
Recall the product rule for distributions, which gives us that
$$(h\Lambda)'=h'\Lambda+h\Lambda'$$
for $h\in\mathcal{C}^\infty(A)$. In particular, taking $h(x)=e^{\int_{x_0}^xg(\xi)\,\mathrm{d}\xi}$ with $x_0\in A$ fixed, we have $h'=gh$, and hence
$$(h\Lambda)'=h'\Lambda+h\Lambda'=hg\Lambda+h\Lambda'=h(\Lambda'+g\Lambda)=hf.$$
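For completeness, here is a sketch of why the product rule holds, directly from the definitions: for any test function $\phi$, since $h$ is smooth, $h\phi$ is again a test function, and
$$(h\Lambda)'(\phi)=-(h\Lambda)(\phi')=-\Lambda(h\phi')=-\Lambda\big((h\phi)'-h'\phi\big)=\Lambda'(h\phi)+\Lambda(h'\phi)=(h\Lambda')(\phi)+(h'\Lambda)(\phi).$$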
Moreover, since $hf$ is continuous, the fundamental theorem of calculus gives
$$h(x)f(x)=\frac{\mathrm{d}}{\mathrm{d}x}\underbrace{\int_{x_0}^xh(t)f(t)\,\mathrm{d}t}_{=k(x)}=k'(x),$$
so that, as distributions,
$$(h\Lambda-k)'=0.$$
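A quick sketch of the lemma being invoked next (that a distribution with zero derivative on an interval is constant): suppose $T'=0$, i.e. $T(\psi')=0$ for every test function $\psi$. Fix $\phi_0\in\mathcal{D}(A)$ with $\int_A\phi_0=1$. Any test function $\phi$ can be written as
$$\phi=\Big(\int_A\phi\Big)\phi_0+\psi',\qquad \psi(x)=\int_{-\infty}^{x}\Big(\phi(t)-\Big(\int_A\phi\Big)\phi_0(t)\Big)\,\mathrm{d}t,$$
where $\psi$ is a test function because the integrand is compactly supported in $A$ with total integral zero. Hence
$$T(\phi)=\Big(\int_A\phi\Big)T(\phi_0)=\int_A c\,\phi(x)\,\mathrm{d}x,\qquad c=T(\phi_0),$$
so $T$ is the constant $c$ as a distribution.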
Since a distribution whose derivative vanishes on an interval is constant (as a distribution), we get that
$$h\Lambda=k+c$$
for some $c\in\mathbb{R}$. Consequently
$$\Lambda=\frac{k+c}{h},$$
i.e.
$$\Lambda(\phi)=\int_A\tau(x)\phi(x)\,\mathrm{d}x$$
with $\tau(x)=\frac{k(x)+c}{h(x)}$. We can check that $\tau$ is $\mathcal{C}^1$: by the quotient rule its derivative is
$$\tau'(x)=\frac{k'(x)h(x)-(k(x)+c)h'(x)}{h(x)^2}=f(x)-\frac{(k(x)+c)h'(x)}{h(x)^2},$$
which is continuous, since $k$ is $\mathcal{C}^1$ (with $k'=hf$) and $h$ is smooth and nonvanishing. The result follows.
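As a numerical sanity check (not part of the proof), here is a small experiment with made-up concrete choices $g(x)=x$, $f(x)=\cos x$ on $A=(0,2)$ and $x_0=1$: it builds $h$, $k$, and $\tau=(k+c)/h$ as above and verifies that $\tau'+g\tau\approx f$ on a grid.

```python
import numpy as np

# Concrete instance of the construction above (choices are illustrative):
# A = (0, 2), g(x) = x (smooth), f(x) = cos(x) (continuous), x0 = 1.
x0 = 1.0
x = np.linspace(0.01, 1.99, 2001)

g = lambda t: t
f = lambda t: np.cos(t)

# h(x) = exp(int_{x0}^x g); here int_{x0}^x t dt = (x^2 - x0^2) / 2
h = np.exp((x**2 - x0**2) / 2)

# k(x) = int_{x0}^x h(t) f(t) dt, computed with the trapezoid rule
integrand = h * f(x)
k = np.concatenate(([0.0], np.cumsum((integrand[1:] + integrand[:-1]) / 2 * np.diff(x))))
k = k - k[np.argmin(np.abs(x - x0))]  # shift so that k(x0) = 0

c = 0.7  # arbitrary constant from the general solution
tau = (k + c) / h

# Check that tau' + g*tau is close to f via finite differences
tau_prime = np.gradient(tau, x)
residual = tau_prime + g(x) * tau - f(x)
print(np.max(np.abs(residual[5:-5])))  # small, limited by grid resolution
```

Any choice of $c$ passes the check, which matches the proof: the constant is absorbed into the general solution of the homogeneous equation.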