Conditional entropy of a multivariate normal distribution


Say I have a multivariate normal distribution, as in here. I know the following:

  • $x_1$ conditioned on $x_2=a$ is also normally distributed, with a new covariance matrix that does not depend on $a$.
  • The entropy of a normally distributed variable is a function of the determinant of its covariance matrix (and not of its mean).
  • Conditioning cannot increase entropy in expectation, i.e. $\mathcal H(x_1\mid x_2)= \mathbb{E}_{x_2}\left[\mathcal H(x_1\mid x_2=a)\right] \leq \mathcal H(x_1)$.

Is it possible to conclude that for an $N$-dimensional $x$ partitioned in this way, conditioning never increases entropy, even for each specific realization $x_2=a$?
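The two bullet points about the Gaussian case can be checked numerically. The sketch below uses a hypothetical 3-dimensional covariance matrix (chosen only for illustration) partitioned into a 2-dimensional block $x_1$ and a 1-dimensional block $x_2$: the conditional covariance is the Schur complement $\Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$, in which $a$ does not appear, and the differential entropy depends only on the log-determinant.

```python
import numpy as np

# Hypothetical positive-definite covariance, partitioned as
# x1 = first two coordinates, x2 = last coordinate.
Sigma = np.array([[2.0, 0.5, 0.8],
                  [0.5, 1.5, 0.3],
                  [0.8, 0.3, 1.0]])
S11, S12 = Sigma[:2, :2], Sigma[:2, 2:]
S21, S22 = Sigma[2:, :2], Sigma[2:, 2:]

# Conditional covariance of x1 given x2 = a (the Schur complement).
# Note that `a` does not enter the formula at all.
S_cond = S11 - S12 @ np.linalg.inv(S22) @ S21

def gaussian_entropy(S):
    """Differential entropy of N(mu, S): 0.5 * log((2*pi*e)^k * det(S))."""
    k = S.shape[0]
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.linalg.slogdet(S)[1])

h_marginal = gaussian_entropy(S11)        # h(x1)
h_conditional = gaussian_entropy(S_cond)  # h(x1 | x2 = a), the same for every a
print(h_conditional <= h_marginal)        # True
```

For any positive-definite $\Sigma$, Fischer's inequality gives $\det(\Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}) \le \det(\Sigma_{11})$, so the printed comparison is not an accident of this particular matrix.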

Best answer:

As you've said, $x_1$ conditioned on $x_2=a$ is normally distributed with a covariance matrix that does not depend on $a$. Hence $h(x_1 \mid x_2 = a)$, the differential entropy of that conditional normal, is constant in $a$.

Hence $$h(x_1\mid x_2 ) = \int f_{x_2}(a) \, h(x_1 \mid x_2 = a) \, da = h(x_1 \mid x_2 = a), $$

and since $h(x_1 \mid x_2) \le h(x_1)$, it follows that $h(x_1 \mid x_2 = a) \le h(x_1)$ for every $a$. Note that this argument is specific to the Gaussian case: for general distributions, conditioning on a particular realization can increase entropy, even though it cannot do so in expectation.
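The final caveat can be made concrete with a small non-Gaussian (discrete) counterexample. The joint distribution below is hypothetical, chosen so that one realization of $Y$ strictly increases the entropy of $X$, while the expected conditional entropy still satisfies $H(X\mid Y) \le H(X)$.

```python
import math

# Hypothetical joint distribution of binary (X, Y):
#   P(Y=1) = 0.1; given Y=0, X = 0 deterministically;
#                 given Y=1, X is uniform on {0, 1}.
p_y1 = 0.1
p_x1 = p_y1 * 0.5  # marginal P(X=1) = 0.05

def H(p):
    """Binary entropy (in bits) of a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(H(p_x1))  # H(X) ~ 0.286 bits
print(H(0.5))   # H(X | Y=1) = 1 bit, which exceeds H(X)
print(H(0.0))   # H(X | Y=0) = 0 bits
# In expectation: H(X|Y) = 0.9 * 0 + 0.1 * 1 = 0.1 <= H(X), as required.
```

So conditioning on the specific realization $Y=1$ increases the entropy of $X$, which is exactly what cannot happen in the jointly Gaussian case.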