I have run into an apparent contradiction. Assume we have two two-dimensional Gaussian random vectors $z_1 = (x_1, y_1)$ and $z_2 = (x_2, y_2)$ with all components independent and normally distributed: $x_1, y_1 \sim N(0,\sigma_1^2)$ and $x_2, y_2 \sim N(0,\sigma_2^2)$.
Now I want to calculate the difference of the entropies of the two vectors, $D = H_1 - H_2$. Since the components are independent Gaussians, their entropies, $H_x = H_y = \frac{1}{2}\log(2\pi e\sigma^2)$, simply add up, so I get $H_1 = 2\cdot \frac{1}{2}\log(2\pi e\sigma_1^2)$ and $H_2 = 2\cdot \frac{1}{2}\log(2\pi e\sigma_2^2)$. Therefore, $D = 2\log(\sigma_1/\sigma_2)$.
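As a quick numerical sanity check (a minimal sketch; the values $\sigma_1 = 2$, $\sigma_2 = 1$ are arbitrary illustrative choices, not part of the problem), the Cartesian computation can be reproduced with SciPy's built-in entropies:

```python
import numpy as np
from scipy.stats import norm

sigma1, sigma2 = 2.0, 1.0  # arbitrary illustrative values

# Entropy of the 2D vector = sum of the two independent component entropies,
# each equal to (1/2) * log(2*pi*e*sigma^2).
H1 = 2 * norm(scale=sigma1).entropy()
H2 = 2 * norm(scale=sigma2).entropy()

print(H1 - H2, 2 * np.log(sigma1 / sigma2))  # agree: 2*log(2) ≈ 1.386
```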
On the other hand, decomposing the vectors into polar coordinates (length $r$ and phase $\phi$, again independent), one finds that $r$ follows a Rayleigh distribution and $\phi$ a uniform distribution on $[0, 2\pi)$. Hence $H_r = \frac{1}{2}\log\!\left(\frac{1}{2}\sigma^2 e^{2+\gamma}\right)$ (with $\gamma$ the Euler–Mascheroni constant) and $H_\phi = \log(2\pi)$. Now the difference of the entropies is
$$D = \frac{1}{2}\log\!\left(\tfrac{1}{2}\sigma_1^2 e^{2+\gamma}\right) - \frac{1}{2}\log\!\left(\tfrac{1}{2}\sigma_2^2 e^{2+\gamma}\right) + \log(2\pi) - \log(2\pi) = \log(\sigma_1/\sigma_2).$$
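The same check in polar coordinates (again with the assumed values $\sigma_1 = 2$, $\sigma_2 = 1$), using SciPy's Rayleigh and uniform entropies, indeed yields half the Cartesian difference:

```python
import numpy as np
from scipy.stats import rayleigh, uniform

sigma1, sigma2 = 2.0, 1.0  # same illustrative values as above

# r ~ Rayleigh(sigma), phi ~ Uniform(0, 2*pi); the polar components are
# independent, so their entropies add up here as well.
H1 = rayleigh(scale=sigma1).entropy() + uniform(loc=0, scale=2 * np.pi).entropy()
H2 = rayleigh(scale=sigma2).entropy() + uniform(loc=0, scale=2 * np.pi).entropy()

print(H1 - H2, np.log(sigma1 / sigma2))  # agree: log(2) ≈ 0.693, half the value above
```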
Can anyone tell me where this factor of 2 comes from?
In my special case the difference of the entropies corresponds to a mutual information, which should be independent of the coordinate system. In any case, a difference of entropies ought to be invariant under a change of basis.
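One way to check the numbers without committing to either closed form is to estimate $D$ directly from the definition of differential entropy, $H = -\mathbb{E}[\log p(x,y)]$, by Monte Carlo in the $(x, y)$ parameterization (a sketch; the sample size and $\sigma$ values are arbitrary):

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
N = 1_000_000  # arbitrary sample size

def entropy_mc(sigma):
    """Monte Carlo estimate of H = -E[log p(x, y)] for an isotropic 2D Gaussian."""
    dist = multivariate_normal(mean=[0.0, 0.0], cov=sigma**2 * np.eye(2))
    z = dist.rvs(size=N, random_state=rng)
    return -dist.logpdf(z).mean()

# Compare against the two candidate answers, 2*log(2) and log(2).
print(entropy_mc(2.0) - entropy_mc(1.0))
```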
(I took the entropies of the normal, uniform, and Rayleigh distributions from Wikipedia.)