I want to calculate the Fourier transform of $u(x) = \frac{x}{1+x^2}$ in one dimension. Since $u\notin L^1(\mathbb{R})$, this has to be done in the distributional sense.
I use the distributional definition of the Fourier transform (denoted by $\mathcal{F}$). In the following, I will use the well-known formula $\mathcal{F}(\frac{1}{x^2+1})(\xi) = \pi e^{-|\xi|}$ as well as the formula $\mathcal{F}( x u ) = (i\partial) \,\mathcal{F}(u)$. Note that the latter formula also holds for tempered distributions. For functions $f\in L^1$, I use the convention
$$\mathcal{F}(f) (\xi) = \int_{\mathbb{R}} f(x) e^{-i x \xi } \,\text{d}x. $$
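For completeness, the multiplication rule used below follows from this convention by differentiating under the integral (a sketch for Schwartz functions; it extends to tempered distributions by duality):
$$\partial_\xi \,\mathcal{F}(f)(\xi) = \int_{\mathbb{R}} f(x)\,(-ix)\, e^{-i x \xi}\,\text{d}x = -i\,\mathcal{F}(x f)(\xi), \qquad \text{so} \qquad \mathcal{F}(x f) = (i\partial_\xi)\,\mathcal{F}(f).$$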
This gives $$\mathcal{F}(u) (\xi)=\mathcal{F}\Big(x \cdot \frac{1}{x^2+1}\Big) (\xi)= (i\partial_\xi)\, \pi e^{-|\xi|} = -i\pi e^{-|\xi|} \operatorname{sgn}(\xi),$$ which differs by a sign from the result WolframAlpha gives. Where is my mistake? As far as I can tell, WolframAlpha uses the same sign convention as I do (neglecting factors of $2\pi$).
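As a sanity check (an illustration, not part of the original question), the claimed answer $-i\pi e^{-|\xi|}\operatorname{sgn}(\xi)$ can be verified numerically. Under the convention above, the cosine part of the integral vanishes because $\frac{x}{1+x^2}\cos(x\xi)$ is odd, and the remaining sine integral converges conditionally, so SciPy's Fourier-integral quadrature (QAWF, via `weight='sin'`) can evaluate it:

```python
# Numerical check of F(x/(1+x^2))(xi) = -i*pi*sgn(xi)*e^{-|xi|}
# under the convention F(f)(xi) = ∫ f(x) e^{-i x xi} dx.
import numpy as np
from scipy.integrate import quad

def ft_imag_part(xi):
    """Imaginary part of F(u)(xi) for xi > 0.

    The real (cosine) part vanishes by oddness, leaving
    Im F(u)(xi) = -∫_R x/(1+x^2) sin(x*xi) dx.
    """
    # QAWF quadrature for the oscillatory Fourier integral on [0, ∞)
    val, _ = quad(lambda x: x / (1 + x**2), 0, np.inf,
                  weight='sin', wvar=xi)
    # The integrand x sin(x*xi)/(1+x^2) is even, so double the half-line value
    return -2 * val

for xi in (0.5, 1.0, 2.0):
    expected = -np.pi * np.exp(-xi)  # imaginary part of -i*pi*e^{-|xi|}
    print(f"xi={xi}: numeric={ft_imag_part(xi):.6f}, expected={expected:.6f}")
```

The printed values agree to several digits, supporting the computation in the question (and pinning the discrepancy on WolframAlpha's convention rather than the calculus).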
Your computation is correct!
Despite what WolframAlpha itself says, it doesn't use the definition on MathWorld, but the one built into Mathematica, which employs an unexpected sign convention.
From the Mathematica documentation of the FourierTransform command: