I want to express a distribution, $f(x)$, that has the following property: $$\int_{-\infty}^x f(x') \operatorname{d}x' = \left\{\begin{array}{cc} 0 & x \le 0 \\ \frac{1}{x} & x > 0 \end{array}\right. = \frac{\Theta(x)}{x},$$ where $\Theta(x)$ is the unit step function. Formally, I would guess that I could just take the derivative using the standard rules to get: $$f(x) = -\frac{\Theta(x)}{x^2} + \frac{\delta(x)}{x}.$$
The problem I have is that integrating the second equation above to recover the required property runs into an apparent $\infty - \infty$ ambiguity. Is this ambiguity handled in distribution theory in some particular way? Is it, perhaps, related in some way to the Cauchy principal value, for example?
Right now, my best guess for how to write $f$ so that it unambiguously has the needed property is: $$f(x) = \lim_{s\rightarrow 0}\left(-\frac{1}{x^2} + \left[\frac{1}{x}\right] \frac{\partial}{\partial x} \right) \sigma(x,s), $$ where $\sigma(x,s)$ is a sigmoid that satisfies $\lim_{x\rightarrow 0^+} \frac{\sigma(x,s)}{x^2} = 0$, $\sigma(x,s)=0$ if $x \le 0$, $\lim_{x\rightarrow \infty} \sigma(x,s) = 1$, and $\lim_{s\rightarrow 0^+} \sigma(x,s) = \Theta(x)$, with the limit understood, formally, to be delayed until after an integral is taken, the same way many think about the delta function. For example: $$\sigma(x,s) = \frac{1}{2}\operatorname{erfc}\left(-\frac{\ln(x/s)}{s\sqrt{2}}\right) \Theta(x),$$ where $\mathrm{erfc}$ is the standard complementary error function.
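As a quick numerical sanity check (my own sketch, not part of the question), the log-normal sigmoid above can be evaluated with the standard-library `math.erfc`; the names `sigma`, `x`, `s` simply mirror the notation above:

```python
import math

def sigma(x, s):
    """Log-normal sigmoid: 0.5 * erfc(-ln(x/s) / (s*sqrt(2))) for x > 0, else 0."""
    if x <= 0.0:
        return 0.0
    return 0.5 * math.erfc(-math.log(x / s) / (s * math.sqrt(2.0)))

s = 0.01
print(sigma(-1.0, s))   # 0.0 for x <= 0, as required
print(sigma(1e-3, s))   # essentially 0 well below the transition region
print(sigma(1.0, s))    # essentially 1 well above the transition region
```

For small $s$ this visibly approaches $\Theta(x)$, and because the transition is log-normal in $x$, $\sigma(x,s)/x^2$ still vanishes as $x \to 0^+$.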
Let $$\ln_+(x) = \begin{cases}\ln x, & x>0 \\ 0, & x\leq 0\end{cases}$$ This is in $L^1_{\mathrm{loc}}(\mathbb R)$ and thus defines a distribution. For $x \neq 0$ its pointwise derivative satisfies $$\ln_+'(x) = \begin{cases}\frac1x, & x>0 \\ 0, & x<0,\end{cases}$$ and since $\Theta(x)/x$ is not itself locally integrable, the distributional derivative $\ln_+'$ is the Hadamard finite part of $\Theta(x)/x$. So you can take $$f = \ln_+''.$$
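To see concretely what $\ln_+'$ does, pair it with a test function via integration by parts: $\langle \ln_+', \varphi\rangle = -\int_0^\infty \ln x\,\varphi'(x)\,dx$, which equals the finite-part limit $\lim_{\varepsilon\to 0}\left[\int_\varepsilon^\infty \frac{\varphi(x)}{x}\,dx + \varphi(\varepsilon)\ln\varepsilon\right]$ — the $\varphi(\varepsilon)\ln\varepsilon$ term is exactly what cancels the $\infty-\infty$ ambiguity. A rough numerical check (my own sketch, with an assumed test function $\varphi(x)=e^{-x^2}$ and a crude trapezoid rule):

```python
import math

def phi(x):
    # assumed smooth, rapidly decaying test function phi(x) = exp(-x^2)
    return math.exp(-x * x)

def dphi(x):
    # its derivative, phi'(x) = -2x exp(-x^2)
    return -2.0 * x * math.exp(-x * x)

def trapezoid(f, a, b, n=200_000):
    # composite trapezoid rule; crude but adequate here
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

# <ln_+', phi> = -<ln_+, phi'> = -int_0^inf ln(x) phi'(x) dx
lhs = -trapezoid(lambda x: math.log(x) * dphi(x), 1e-12, 10.0)

# Finite-part evaluation: int_eps^inf phi(x)/x dx + phi(eps) ln(eps),
# with the 1/x integral done in the variable t = ln(x) to tame the singularity.
eps = 1e-6
rhs = trapezoid(lambda t: phi(math.exp(t)), math.log(eps), math.log(10.0)) \
      + phi(eps) * math.log(eps)

print(lhs, rhs)  # both come out near -0.2886 (which is -EulerGamma/2 for this phi)
```

The two evaluations agree, even though each piece of the finite-part expression diverges separately as $\varepsilon \to 0$.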
I understand that you want it expressed in a more explicit form, but that's not always possible.