Context: Statistical Inference and Differential Geometry
Let's consider a generic distribution $ p(x;\theta) $ with parameter vector $ \theta $. By normalization, $$ \int p(x; \theta)\, dx = 1 $$
and this must equal exactly 1 for every value of $ \theta $, hence $$ \frac{\partial}{\partial \theta_{i}} \int p(x; \theta)\, dx = 0 $$
Now let's consider the negative entropy of the distribution (the expected value of the log-density) $$ S = E_{\theta}\left [ \ln(p(x; \theta)) \right ] = \int \ln(p(x;\theta))\, p(x; \theta)\, dx $$
Applying the derivative inside the expectation we should get $$ E_{\theta}\left [ \frac{\partial}{\partial \theta_{i}} \ln(p(x; \theta)) \right ] = \int \frac{\partial}{\partial \theta_{i}} \ln(p(x; \theta))\, p(x; \theta)\, dx = \int \frac{\partial}{\partial \theta_{i}} p(x; \theta)\, dx = 0 \quad \forall i $$ where the last equality follows from the normalization identity above, provided the derivative can be moved outside the integral.
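As a quick numerical sanity check of the identity $ E_{\theta}\left[\frac{\partial}{\partial \theta_i} \ln p(x;\theta)\right] = 0 $ (the score has mean zero): for a Gaussian $N(\mu, \sigma^2)$ the score with respect to $\mu$ is $(x-\mu)/\sigma^2$, and a Monte Carlo average of it under the distribution itself should vanish. This is my own illustration, not part of the argument; the choice of `numpy` and the parameter values are arbitrary:

```python
import numpy as np

# Monte Carlo check that the score has mean zero: for N(mu, sigma^2),
# d/dmu ln p(x; mu, sigma) = (x - mu) / sigma^2, and its expectation
# under p(.; mu, sigma) itself should be 0 for any mu.
rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 1.5, 1_000_000

x = rng.normal(mu, sigma, size=n)   # samples from p(x; mu, sigma)
score = (x - mu) / sigma**2         # d/dmu ln p evaluated at the samples

print(abs(score.mean()))            # close to 0, up to Monte Carlo error
```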
Now, if it were possible to move the derivative freely in and out of the integral, we would also have $$ E_{\theta}\left [ \frac{\partial}{\partial \theta_{i}} \ln(p(x; \theta)) \right ] = \frac{\partial}{\partial \theta_{i}} E_{\theta} \left [ \ln(p(x; \theta)) \right ] $$
hence the (negative) entropy would be constant with respect to the parameters, but this does not seem correct to me: consider for example the Gaussian distribution, whose entropy is $ \frac{1}{2}\ln(2\pi e \sigma^{2}) $ and clearly depends on the variance.
Is the problem that the parametrization above is somehow special, or does it come from moving the derivative freely in and out of the integral defining the expected value?
The error is here:
$$ E_{\theta}\left [ \frac{\partial}{\partial \theta_{i}} \ln(p(x; \theta)) \right ] = \frac{\partial}{\partial \theta_{i}} E_{\theta} \left [ \ln(p(x; \theta)) \right ] $$
This is incorrect because the density inside the expectation also depends on $\theta$. Writing the expectation as an integral and applying the product rule, $$ \frac{\partial}{\partial \theta_{i}} \int \ln(p(x;\theta))\, p(x;\theta)\, dx = \int \frac{\partial \ln p(x;\theta)}{\partial \theta_{i}}\, p(x;\theta)\, dx + \int \ln(p(x;\theta))\, \frac{\partial p(x;\theta)}{\partial \theta_{i}}\, dx $$ so there is one additional term $$ \int \ln(p(x;\theta))\, \frac{\partial p(x;\theta)}{\partial \theta_{i}}\, dx = E_\theta\left[\ln p(x;\theta)\, \frac{\partial \ln p(x;\theta)}{\partial \theta_{i}}\right]. $$ The first term is the expected score, which is zero by the normalization identity; it is the second term that makes the entropy vary with $\theta$.
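To make the decomposition concrete, here is a numerical sketch (my own, for a Gaussian $N(0,\sigma^2)$ with scale parameter $\sigma$, using plain Riemann sums and central finite differences): the total derivative $dS/d\sigma$ splits into the score term, which vanishes, plus the extra term, which alone accounts for $dS/d\sigma = -1/\sigma$ (since $S = -\tfrac12\ln(2\pi\sigma^2) - \tfrac12$):

```python
import numpy as np

# Numerical check for p(x; sigma) = N(0, sigma^2):
#   dS/dsigma = E[d/dsigma ln p] + integral of ln p(x) * (d/dsigma p(x)) dx
# The first (score) term is 0; the second term carries the whole
# derivative, analytically equal to -1/sigma here.
sigma, h = 1.5, 1e-5
x = np.linspace(-20.0, 20.0, 400_001)
dx = x[1] - x[0]

def p(s):
    # Gaussian density N(0, s^2) on the grid
    return np.exp(-x**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

def S(s):
    # negative entropy E[ln p] via Riemann sum
    return np.sum(np.log(p(s)) * p(s)) * dx

dS = (S(sigma + h) - S(sigma - h)) / (2 * h)        # total derivative dS/dsigma
dlogp = (np.log(p(sigma + h)) - np.log(p(sigma - h))) / (2 * h)
score_term = np.sum(dlogp * p(sigma)) * dx          # E[d/dsigma ln p] ~ 0
dp = (p(sigma + h) - p(sigma - h)) / (2 * h)
extra_term = np.sum(np.log(p(sigma)) * dp) * dx     # the missing extra term

print(dS, score_term, extra_term, -1 / sigma)
```

Running it shows `score_term` near zero while `extra_term` matches both `dS` and the analytic value $-1/\sigma \approx -0.667$.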