The context of this question is variational inference (variational Bayes) with a factorizing posterior distribution, an approach also known as the mean field approximation. However, the argument here is purely algebraic.
Let $q(\theta)$ be a density, where $\theta$ is a parameter vector. Assume $$q(\theta)=\prod_{i=1}^D q_i(\theta_i),$$ where $\int q_i(\theta_i) \, d \theta_i=1$ for each $i$. Now consider the integral $$\int q(\theta) \ln q(\theta) \, d\theta,$$ which, under the assumption made above, apparently factorizes to
$$\sum_{i=1}^D \int q_i(\theta_i) \ln q_i(\theta_i)d\theta_i.$$
I do not understand why. Expanding the logarithm into a sum, I get as far as
$$\prod_{i=1}^D \int q_i(\theta_i) \sum_{j=1}^D \ln q_j(\theta_j) \, d\theta_i,$$
but I do not know how to get to the result from here.
Let's expand the log first, so that the sum comes outside:
$$ \int q(\theta) \log{q(\theta)} \, d\theta = \int q(\theta) \sum_i \log{q_i(\theta_i)} \, d\theta = \sum_i \int q(\theta) \log{q_i(\theta_i)} \, d\theta. $$

Now expand $q(\theta)$:
$$ \int q(\theta) \log{q_i(\theta_i)} \, d\theta = \int \left(\prod_j q_j(\theta_j) \right) \log{q_i(\theta_i)} \, d\theta. $$

Separate off the $q_i(\theta_i)$ from the product, and split the integral over $\theta$ into a product of one-dimensional integrals:
$$ \int \left(\prod_j q_j(\theta_j) \right) \log{q_i(\theta_i)} \, d\theta = \int \left(\prod_{j\neq i} q_j(\theta_j) \right) q_i(\theta_i) \log{q_i(\theta_i)} \, d\theta \\ = \left( \int q_i(\theta_i) \log{q_i(\theta_i)} \, d\theta_i \right) \left( \prod_{j \neq i} \int q_j(\theta_j) \, d\theta_j \right). $$

The integrals in the second bracket are all $1$ by the normalization assumption, and summing over $i$ gives the answer.
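If it helps to convince yourself, here is a quick numerical sanity check of the identity (a sketch I put together, not part of the derivation itself): take $D=2$ with two Gaussian factors (means and variances chosen arbitrarily for illustration) and compare $\int q(\theta)\ln q(\theta)\,d\theta$ against $\sum_i \int q_i(\theta_i)\ln q_i(\theta_i)\,d\theta_i$, both approximated by Riemann sums on a grid.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Density of a univariate normal N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Quadrature grid, wide enough that both factors are negligible at the edges.
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

q1 = gauss(x, 0.0, 1.0)   # q_1(theta_1), illustrative choice
q2 = gauss(x, 2.0, 0.5)   # q_2(theta_2), illustrative choice

# Joint density q(theta) = q_1(theta_1) q_2(theta_2) on the product grid.
Q = np.outer(q1, q2)

# Left-hand side: double integral of q ln q.
lhs = np.sum(Q * np.log(Q)) * dx * dx

# Right-hand side: sum of the one-dimensional integrals.
rhs = np.sum(q1 * np.log(q1)) * dx + np.sum(q2 * np.log(q2)) * dx

print(lhs, rhs)  # the two agree up to quadrature error
```

Note that both sides equal the negative sum of the two Gaussian differential entropies, $-\sum_i \tfrac{1}{2}\ln(2\pi e \sigma_i^2)$, which gives an independent check on the quadrature.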