I am currently studying David MacKay's *Information Theory, Inference, and Learning Algorithms*.
In Chapter 20, he discusses modifying the K-means algorithm into a soft K-means algorithm (a Gaussian mixture model).
In the equation, $P(x)$ is the Gaussian distribution and $r(x)$ is the probability that a point belongs to a given cluster.
The reduced form of $\frac{\int dx\, P(x)\, x\, r(x)}{\int dx\, P(x)\, r(x)}$ is given as $2\int dx\, P(x)\, x\, r(x)$, which implies that the denominator equals $\frac{1}{2}$.
I know that $\int dx\, P(x) = 1$, but I don't understand how to get rid of $r(x)$.
Or is there another way to evaluate the integral?
Here is the original question and solution:

Let $a = \beta m$. Differentiating the integral $\int_{-\infty}^\infty \frac{P(x)}{1+e^{-2ax}}\,dx$ with respect to $a$ gives $$ \int_{-\infty}^\infty \frac{2x P(x)e^{-2a x}}{(1+e^{-2ax})^2}\,dx = 2\int_{-\infty}^\infty \frac{x P(x)}{\left[e^{ax}(1+e^{-2ax})\right]^2}\,dx = \sqrt{\frac{2}{\pi}}\int_{-\infty}^\infty \frac{x e^{-x^2/2}}{(e^{ax}+e^{-ax})^2}\,dx = 0, $$ where in the last step we used the fact that the integrand is clearly odd. Thus the integral is constant with respect to $a$, and in particular is equal to its value at $a=0$. So $$ \int_{-\infty}^\infty \frac{P(x)}{1+e^{-2\beta m x}}\,dx = \int_{-\infty}^\infty \frac{P(x)}{1+e^{0}}\,dx = \frac{1}{2}\int_{-\infty}^\infty P(x)\,dx = \frac{1}{2}. $$
Our good friends at WolframAlpha confirm that this is the case.
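As a further sanity check (this is my own addition, not part of the quoted solution), here is a quick numerical verification in Python. It assumes $P(x)$ is the standard normal density, as the $\sqrt{2/\pi}$ factor above suggests, and computes the sigmoid in a numerically stable form so large $a$ doesn't overflow:

```python
import math

def normal_pdf(x):
    """Standard normal density P(x) = exp(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def sigmoid(t):
    """Numerically stable 1 / (1 + exp(-t))."""
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    e = math.exp(t)
    return e / (1.0 + e)

def integrate(f, lo=-20.0, hi=20.0, n=200_000):
    """Midpoint-rule quadrature; [-20, 20] captures essentially all normal mass."""
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

# The integral should equal 1/2 for every value of a = beta * m.
for a in (0.0, 0.5, 1.0, 3.0, 10.0):
    val = integrate(lambda x: normal_pdf(x) * sigmoid(2 * a * x))
    print(f"a = {a:5.1f}:  integral = {val:.6f}")
```

Every value of $a$ should give $0.5$ to within quadrature error, matching the argument above.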
As for the other integral, it is possible to show using integration by parts that $$ \int_{-\infty}^\infty \frac{2xP(x)}{1+\exp(-2\beta m x)}\,dx = \int_{-\infty}^\infty P\left(\frac{u}{\beta m}\right)\mathrm{sech}^2(u)\, du, $$ but neither of these integrals appears in any table I can find.
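While I can't evaluate it in closed form either, the integration-by-parts identity itself is easy to check numerically. A sketch, again assuming $P$ is the standard normal density and using simple midpoint-rule quadrature:

```python
import math

def normal_pdf(x):
    """Standard normal density P(x)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def sigmoid(t):
    """Numerically stable 1 / (1 + exp(-t))."""
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    e = math.exp(t)
    return e / (1.0 + e)

def integrate(f, lo, hi, n=200_000):
    """Midpoint-rule quadrature on [lo, hi]."""
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

a = 1.5  # a = beta * m; any positive value can be tried here

# Left side: integral of 2x P(x) / (1 + exp(-2ax)).
lhs = integrate(lambda x: 2 * x * normal_pdf(x) * sigmoid(2 * a * x), -20, 20)

# Right side: integral of P(u/a) * sech(u)^2; sech^2 decays fast, so [-30, 30] suffices.
rhs = integrate(lambda u: normal_pdf(u / a) / math.cosh(u) ** 2, -30, 30)

print(lhs, rhs)
```

The two printed values agree to within quadrature error, which at least confirms the identity is stated correctly.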