I'm trying to solve the following equation, where $u:\mathbb{R} \rightarrow \mathbb{R}$, $\lambda>0$, and $\delta$ is the Dirac delta function:
$$\int_{-\infty}^{\infty} dy \frac{u(y)-u(x)}{(y-x)^2} + u(x)^2 = \lambda \delta(x) $$
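For context, the nonlocal term appears to be (up to a constant) the half-Laplacian $-(-\Delta)^{1/2}$: assuming the principal-value interpretation of the integral, it acts as a Fourier multiplier, since
$$\mathrm{p.v.}\int_{-\infty}^{\infty} \frac{e^{iqz}-1}{z^2}\, dz = 2\int_0^{\infty} \frac{\cos(qz)-1}{z^2}\, dz = -\pi|q|,$$
so the operator multiplies the transform of $u$ by $-\pi|q|$ (below I drop constant factors such as this $\pi$).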
I tried to expand the solution in $\lambda$: $$u(x)=\lambda u_1(x) + \lambda^2 u_2(x) + \dots$$ and pass to Fourier space, where the equation reads ($*$ denotes the convolution product):
$$-|q| u(q) + \frac{1}{2\pi}(u*u)(q) = \lambda.$$
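As a numerical sanity check on the $|q|$ multiplier (a sketch assuming SciPy is available; the split point $z=10$ for the oscillatory tail is an arbitrary choice), one can verify $\mathrm{p.v.}\int (e^{iqz}-1)/z^2\, dz = -\pi|q|$, the factor of $\pi$ being absorbed into the normalization:

```python
import numpy as np
from scipy.integrate import quad

def symbol(q, split=10.0):
    """Numerically evaluate p.v. ∫ (e^{iqz}-1)/z^2 dz = 2 ∫_0^∞ (cos(qz)-1)/z^2 dz."""
    def head_integrand(z):
        # (cos(qz)-1)/z^2 -> -q^2/2 as z -> 0, so the integrand is regular at 0
        return (np.cos(q * z) - 1.0) / z**2 if z > 0 else -0.5 * q * q
    head, _ = quad(head_integrand, 0.0, split, limit=200)
    # oscillatory tail of the cosine part, via QUADPACK's cosine-weighted rule
    tail_cos, _ = quad(lambda z: 1.0 / z**2, split, np.inf, weight='cos', wvar=q)
    # the "-1" part of the tail integrates exactly: ∫_split^∞ dz/z^2 = 1/split
    return 2.0 * (head + tail_cos - 1.0 / split)

for q in (0.5, 1.5):
    print(q, symbol(q), -np.pi * q)  # the two computed values should agree
```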
The first-order term $u_1(q)=\frac{1}{|q|}$ does not have a well-defined inverse Fourier transform, yet Mathematica yields $u_1(x) = -(2\ln(|x|)+2\gamma)$, $\gamma$ being Euler's constant (I don't know how, though). But from the second order onward the terms are divergent.
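The logarithm is consistent with a regularized transform: subtracting the low-$q$ divergence by hand, the classical identity $\int_0^\infty \big(\cos(qx) - \mathbf{1}_{q<1}\big)\,\frac{dq}{q} = -(\gamma + \ln|x|)$ produces exactly the $\ln|x| + \gamma$ combination, matching what Mathematica returns up to an overall factor coming from the transform normalization (I assume it applies some such regularization). A sketch checking the identity numerically, assuming SciPy:

```python
import numpy as np
from scipy.integrate import quad

def reg_transform(x):
    """∫_0^∞ (cos(qx) - 1_{q<1}) dq/q, regularized by subtracting 1 on (0,1)."""
    # ∫_0^1 (cos(qx) - 1)/q dq: the subtraction removes the 1/q singularity
    head, _ = quad(lambda q: (np.cos(q * x) - 1.0) / q if q > 0 else 0.0, 0.0, 1.0)
    # ∫_1^∞ cos(qx)/q dq, via QUADPACK's cosine-weighted rule for oscillatory tails
    tail, _ = quad(lambda q: 1.0 / q, 1.0, np.inf, weight='cos', wvar=x)
    return head + tail

x = 2.0
print(reg_transform(x), -(np.euler_gamma + np.log(x)))  # the two should agree
```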
I searched the web but did not find any similar equation.
I would be thankful for any hints, new ideas to try, or links to similar equations.