Let $p(x)\in\mathbb{R}[X]$ be a polynomial of degree $n$ with no real roots. Show that: $$\int\limits_{-\infty}^{+\infty}\dfrac{(p'(x))^2}{(p'(x))^2+(p(x))^2}\,dx \leq n^{3/2}\pi.$$
It's easy to see that the degree of $p$ has to be even: since $p$ has no real roots, its roots come in complex-conjugate pairs.
For $n=2$ this integral is at most $2\pi$.
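Indeed, write such a quadratic as $p(x)=\lambda\big((x-a)^2+c\big)$ with $c>0$ (rescaling by $\lambda$ leaves the integrand unchanged). Substituting $t=x-a$ and using $(t^2+c)^2\ge t^4$,
$$\int_{-\infty}^{+\infty}\frac{4t^2}{4t^2+(t^2+c)^2}\,dt\le\int_{-\infty}^{+\infty}\frac{4t^2}{4t^2+t^4}\,dt=\int_{-\infty}^{+\infty}\frac{4}{4+t^2}\,dt=2\pi,$$
with equality approached as $c\to 0$.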
For $n>2$ the supremum of this integral appears to be approached as all the imaginary parts of the roots of $p(x)$ tend to $0$, but I couldn't get further than that.
Any help would be appreciated, thanks.
Edits by David Speyer: It seems very likely now that the optimal bound is $n \pi$, not $n^{3/2} \pi$. As pointed out in the comments below, and further in 23rd's question, this is the value obtained in the limit where $p$ has double roots on the real axis.
It seems likely that moving the roots of $p$ away from the real axis can only make the integral less. Write the roots of $p$ as $a_j \pm i b_j$, so $p(x) = \prod ((x-a_j)^2 + b_j^2)$ and $$\frac{p'(x)}{p(x)} = \sum \frac{2 (x-a_j)}{(x-a_j)^2 + b_j^2}.\quad (\ast)$$ So making the $b_j$ larger tends to make $p'(x)/p(x)$ smaller, which makes the integral smaller. But this argument is not rigorous, because the terms of $(\ast)$ can have both positive and negative sign, so it could be that making the individual terms closer to $0$ makes the absolute value of $(\ast)$ larger. I don't see how to beat this issue easily.
Thus, I'm putting up a bounty for proving or disproving
$$ \int_{-\infty}^{+\infty}\dfrac{(p'(x))^2}{(p'(x))^2+(p(x))^2}\,dx \leq n\pi.$$
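As a sanity check on this conjecture (not a proof), here is a quick numerical experiment; it assumes `numpy`/`scipy` are available and samples random root configurations $a_j\pm ib_j$:

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)

def integral_for_random_p(m):
    """Sample p(x) = prod_k ((x - a_k)^2 + b_k^2) with random a_k, b_k > 0
    and numerically integrate p'^2 / (p'^2 + p^2) over the real line."""
    a = rng.uniform(-3.0, 3.0, m)
    b = rng.uniform(0.1, 2.0, m)
    roots = np.concatenate([a + 1j * b, a - 1j * b])  # conjugate pairs: no real roots
    p = np.poly(roots).real       # monic coefficients, degree n = 2m
    dp = np.polyder(p)

    def integrand(x):
        pv, dpv = np.polyval(p, x), np.polyval(dp, x)
        return dpv**2 / (dpv**2 + pv**2)

    # integrand ~ n^2/x^2 at infinity, so the improper integral converges
    val, _ = quad(integrand, -np.inf, np.inf, limit=200)
    return val

for m in (1, 2, 3, 4):
    n = 2 * m
    print(f"n={n}: integral ~ {integral_for_random_p(m):.4f},  n*pi ~ {n * np.pi:.4f}")
```

No amount of sampling proves the bound, of course, but this makes the conjecture easy to probe against many root configurations.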

Without loss of generality, we may assume that $p$ is monic (rescaling $p$ by a nonzero constant leaves the integrand unchanged). Since $p$ has no real roots, $n=2m$ for some $m\ge 1$, and there exist monic quadratic polynomials $q_1,\dots,q_m$ with no real roots such that $p=\prod_{k=1}^m q_k$. Therefore, by the Cauchy-Schwarz inequality,
$$ \left(\frac{p'}{p}\right)^2=\left(\sum_{k=1}^m \frac{q_k'}{q_k}\right)^2\le m\cdot \sum_{k=1}^m \left(\frac{q_k'}{q_k}\right)^2. \tag{1}$$ Let $f(t):=\frac{t}{1+t}$ for $t\ge 0$. Note that $f$ is increasing and subadditive: for $s,t\ge 0$, $f(s+t)=\frac{s}{1+s+t}+\frac{t}{1+s+t}\le \frac{s}{1+s}+\frac{t}{1+t}=f(s)+f(t)$. Then from $(1)$ it follows that
$$\frac{p'^2}{p'^2+p^2}=f\left(\left(\frac{p'}{p}\right)^2\right)\le \sum_{k=1}^m f\left(m\cdot\left(\frac{q_k'}{q_k}\right)^2\right). \tag{2}$$ For each $k$, since $q_k$ has no real roots, there exist $a_k\in\mathbb R$ and $c_k>0$ such that $q_k(x)=(x-a_k)^2+c_k$. Therefore,
$$\int_{-\infty}^{+\infty}f\left(m\cdot\left(\frac{q_k'(x)}{q_k(x)}\right)^2\right)dx\le \int_{-\infty}^{+\infty} f\left( \frac{4m}{(x-a_k)^2}\right)dx=2\sqrt{m}\cdot\pi, \tag{3}$$ where the inequality holds because $q_k'(x)=2(x-a_k)$, $q_k(x)\ge (x-a_k)^2$, and $f$ is increasing. Combining $(2)$ and $(3)$, we obtain $$\int_{-\infty}^{+\infty}\frac{p'^2(x)}{p'^2(x)+p^2(x)}dx\le m\cdot 2\sqrt{m}\cdot\pi=2^{-\frac{1}{2}}\cdot n^{\frac{3}{2}}\cdot\pi.$$
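The closed-form value in $(3)$ is elementary: substituting $t=x-a_k$,
$$\int_{-\infty}^{+\infty} f\!\left(\frac{4m}{t^2}\right)dt=\int_{-\infty}^{+\infty}\frac{4m}{4m+t^2}\,dt=4m\cdot\frac{\pi}{2\sqrt{m}}=2\sqrt{m}\cdot\pi.$$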