I have a Gaussian source $X \sim N(\mu, \Sigma)$, and under the squared-error fidelity criterion $E[(X-Y)'(X-Y)]$ the optimal output $Y$ differs from $X$ by an independent error $Z \sim N(\eta, \Theta)$. Since $X$ and $Z$ are Normal and $Z$ is independent of $Y$, $Y$ must be Normal too (by Cramér's decomposition theorem). So we have (I think this is called the backward channel in communication theory, but I'm not sure) $$X = Y + Z$$
with each of the above Normally distributed. If the mutual information between the source and the transmitted output, $I(X,Y)$, equals some number $m$, then I would expect this to also be the entropy of $Y$: $h(Y)=m$. Finally, from the mutual information identity $$I(X,Y)=h(X)-h(X|Y)=h(X)-h(Z)$$
using the formulas for the entropy of a Gaussian, I get $$m=\frac{1}{2}\ln\left(\frac{\det(\Sigma)}{\det(\Theta)}\right)$$
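As a quick sanity check (with example covariances I made up purely for illustration), the log-det ratio agrees with computing $h(X)-h(Z)$ from the full Gaussian entropy formula, since the $(2\pi e)^n$ factors cancel:

```python
import numpy as np

# Hypothetical example covariances (chosen only for illustration)
Sigma = np.array([[2.0, 0.5], [0.5, 1.5]])  # Cov(X)
Theta = np.array([[0.5, 0.1], [0.1, 0.4]])  # Cov(Z)

def gaussian_entropy(C):
    """Differential entropy of N(mu, C): 0.5 * ln((2*pi*e)^k * det(C))."""
    k = C.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(C))

# I(X,Y) = h(X) - h(Z) ...
m_via_entropies = gaussian_entropy(Sigma) - gaussian_entropy(Theta)

# ... which reduces to the log-det ratio, the (2*pi*e)^k terms cancelling
m = 0.5 * np.log(np.linalg.det(Sigma) / np.linalg.det(Theta))

print(m, m_via_entropies)  # the two agree
```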
However, when I solve this equality for $\Theta$ and then infer the variance of $Y$ as $V=\Sigma-\Theta$, I discover that the entropy of a Normal with variance $V$ is not equal to $m$, as I had expected. Where am I going wrong here?
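For what it's worth, the discrepancy is easy to reproduce numerically (again with made-up covariances, not from any particular problem):

```python
import numpy as np

Sigma = np.array([[2.0, 0.5], [0.5, 1.5]])  # Cov(X), made up for illustration
Theta = np.array([[0.5, 0.1], [0.1, 0.4]])  # Cov(Z)

def gaussian_entropy(C):
    """Differential entropy of N(mu, C)."""
    k = C.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(C))

m = 0.5 * np.log(np.linalg.det(Sigma) / np.linalg.det(Theta))  # I(X,Y)
V = Sigma - Theta          # Cov(Y), since X = Y + Z with Y, Z independent
h_Y = gaussian_entropy(V)  # entropy of N(., V)

print(m, h_Y)  # these differ: h(Y) != m
```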
OK, answering my own question was not my original intention, but it looks like I have some kind of conjecture, so I will post it as an answer and see whether others support or criticize it.
At first I was confused by my numerical experiments and thought I was making some stupid coding error, but this post https://stats.stackexchange.com/questions/66092/entropy-of-sum-of-gaussians-versus-sum-of-entropy-of-gaussians convinced me that this may not be an error but a correct (albeit undesirable) result.
So, because of the known issues with differential entropy (which is what applies in the Gaussian case), such as the absence of invariance under reparameterization, the possibility of negative values, etc., the entropy $h(Y)$ is simply not required to equal the mutual information $I(X,Y)$. Using $\Sigma$ and $m$ to obtain $\Theta$ is fine, and using $\Theta$ with $\Sigma$ does give the correct $V=\Sigma-\Theta$. Moreover, $I(X,Y)=m$ does end up equal to $h(X)-h(Z)$. But the entropy $h(Y)$ does not have to equal $m$: by the other decomposition, $I(X,Y)=h(Y)-h(Y|X)$, so $h(Y)=m$ would require $h(Y|X)=0$, which does not hold here.
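A sketch of this in code (same made-up covariances as in my experiments): since $\mathrm{Cov}(Y,X)=\mathrm{Cov}(Y,Y+Z)=V$ when $Y$ and $Z$ are independent, the conditional covariance of $Y$ given $X$ is $V - V\Sigma^{-1}V$, and indeed $h(Y)-h(Y|X)$ recovers $m$ while $h(Y)$ alone does not:

```python
import numpy as np

Sigma = np.array([[2.0, 0.5], [0.5, 1.5]])  # Cov(X), illustrative values
Theta = np.array([[0.5, 0.1], [0.1, 0.4]])  # Cov(Z)
V = Sigma - Theta                           # Cov(Y)

def gaussian_entropy(C):
    """Differential entropy of N(mu, C)."""
    k = C.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(C))

m = 0.5 * np.log(np.linalg.det(Sigma) / np.linalg.det(Theta))  # I(X,Y)

# Cov(Y, X) = V, so the conditional covariance of Y given X is:
C_cond = V - V @ np.linalg.inv(Sigma) @ V
h_Y = gaussian_entropy(V)
h_Y_given_X = gaussian_entropy(C_cond)

print(h_Y - h_Y_given_X, m)  # equal: I(X,Y) = h(Y) - h(Y|X) = m
print(h_Y, m)                # not equal: h(Y) != m
```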
My own informal way of convincing myself that this is not crazy: we know variances are well-behaved, so for independent $Y$ and $Z$ the sum of their variances, $V + \Theta$, must equal the variance $\Sigma$. The analogous intuitive result does not hold for entropies, i.e. $h(X) \ne h(Y) + h(Z)$, and this is explained by the fact that differential entropies are not as well-behaved.
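In code form (illustrative covariances again): the variances add up exactly, while the entropies do not:

```python
import numpy as np

Sigma = np.array([[2.0, 0.5], [0.5, 1.5]])  # Cov(X), illustrative
Theta = np.array([[0.5, 0.1], [0.1, 0.4]])  # Cov(Z)
V = Sigma - Theta                           # Cov(Y)

def gaussian_entropy(C):
    """Differential entropy of N(mu, C)."""
    k = C.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(C))

# Variances of independent Y and Z add exactly to the variance of X = Y + Z:
assert np.allclose(V + Theta, Sigma)

# ... but the corresponding entropies do not add:
gap = gaussian_entropy(Sigma) - (gaussian_entropy(V) + gaussian_entropy(Theta))
print(gap)  # nonzero: h(X) != h(Y) + h(Z)
```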