Prove a theorem about an upper bound on the entropy of a random vector


There is a theorem that:

If $Z$ is any zero-mean complex random vector with covariance $E[ZZ^H]=R_Z$, then its differential entropy satisfies $H(Z)\leq \log\left|\pi e R_Z\right|$, where $|\cdot|$ denotes the determinant, with equality if and only if $Z$ has a circularly symmetric complex Gaussian distribution.

I do not know the name of this theorem, so I cannot look up its proof in books or articles. I would be thankful if someone could tell me its name, or offer a proof of the theorem.

BEST ANSWER

This theorem does not have a specific name; it is the maximum-entropy property of the Gaussian distribution (here, in its circularly symmetric complex form). A proof is given on page 4 of the following article by Telatar:

http://lthiwww.epfl.ch/~leveque/Projects/telatar.pdf
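As a numerical sanity check of the equality case (this is not from the linked article; the covariance $R$ below is an arbitrary example), one can compare a Monte Carlo estimate of $H(Z) = -E[\log p(Z)]$ for a circularly symmetric complex Gaussian $Z \sim \mathcal{CN}(0, R)$ against the closed form $\log|\pi e R|$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2

# Build an arbitrary Hermitian positive-definite covariance R (example only)
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
R = A @ A.conj().T + n * np.eye(n)

# Sample Z ~ CN(0, R) via a Cholesky factor: Z = L W with W ~ CN(0, I)
L = np.linalg.cholesky(R)
m = 200_000
W = (rng.normal(size=(n, m)) + 1j * rng.normal(size=(n, m))) / np.sqrt(2)
Z = L @ W

# Density of CN(0, R): p(z) = exp(-z^H R^{-1} z) / (pi^n det R)
Rinv = np.linalg.inv(R)
quad = np.einsum('im,ij,jm->m', Z.conj(), Rinv, Z).real
log_p = -quad - n * np.log(np.pi) - np.log(np.linalg.det(R).real)

# Monte Carlo estimate of H(Z) = -E[log p(Z)]
H_mc = -log_p.mean()

# Closed form: log|pi e R| = n log(pi e) + log det R
H_theory = n * np.log(np.pi * np.e) + np.log(np.linalg.det(R).real)

print(H_mc, H_theory)  # the two values should agree to within Monte Carlo error
```

For a non-Gaussian $Z$ with the same covariance $R$, the same Monte Carlo estimate of $-E[\log p_Z(Z)]$ would come out strictly below `H_theory`, which is exactly the inequality in the theorem.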