How to find the entropy of a circularly symmetric complex Gaussian vector?

I have a random complex column vector $\mathbf{x}$ of length $L$ with a circularly symmetric complex Gaussian probability density function with mean $0$ and covariance matrix $\sigma^2 \mathbf{I}$, where $\mathbf{I}$ is the identity matrix of size $L$. I have read that for such a random vector the differential entropy is given by $$H(\mathbf{x})=\log_2 \det(\pi e\sigma^2\mathbf{I}).$$ I know that the formula for the entropy is $$H(\mathbf{x})=-\int_{-\infty}^{\infty}p(\mathbf{x})\log_2\left(p(\mathbf{x})\right)d\mathbf{x}.~~~~~~\text{Eq. 1}$$ Further, I know that $$p(\mathbf{x})=\frac{1}{\pi^L \det(\sigma^2\mathbf{I})}\exp\left(-\frac{\|\mathbf{x}\|^2}{\sigma^2}\right).~~~~~\text{Eq. 2}$$ When I put Eq. 2 into Eq. 1 I get $$H(\mathbf{x})=\frac{1}{(\pi\sigma^2)^L \ln(2)}\left[\int_{-\infty}^{\infty}\ln\left((\pi\sigma^2)^L\right)\exp\left(-\frac{\|\mathbf{x}\|^2}{\sigma^2}\right)d\mathbf{x}+\frac{1}{\sigma^2 }\int_{-\infty}^{\infty}\|\mathbf{x}\|^2\exp\left(-\frac{\|\mathbf{x}\|^2}{\sigma^2}\right)d\mathbf{x}\right].$$ How do I proceed further to arrive at $H(\mathbf{x})=\log_2\det(\pi e \sigma^2\mathbf{I})$? Any help in this regard will be much appreciated. Thanks in advance.
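For reference, both integrals in the last expression can be evaluated in closed form; these are standard Gaussian integrals, where each complex coordinate $x_i = u_i + jv_i$ contributes a two-dimensional real integral:

$$\int_{-\infty}^{\infty}\exp\left(-\frac{\|\mathbf{x}\|^2}{\sigma^2}\right)d\mathbf{x}=(\pi\sigma^2)^L,\qquad \int_{-\infty}^{\infty}\|\mathbf{x}\|^2\exp\left(-\frac{\|\mathbf{x}\|^2}{\sigma^2}\right)d\mathbf{x}=L\sigma^2(\pi\sigma^2)^L,$$

so that

$$H(\mathbf{x})=\frac{1}{(\pi\sigma^2)^L\ln(2)}\left[L\ln(\pi\sigma^2)\,(\pi\sigma^2)^L+\frac{1}{\sigma^2}\,L\sigma^2(\pi\sigma^2)^L\right]=L\log_2(\pi\sigma^2)+L\log_2(e)=\log_2\det(\pi e\sigma^2\mathbf{I}).$$

The two answers below reach the same result via the real-vector entropy formula.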
I am not sure if you have already gotten the answer. I am just posting my method. Hope it helps!
- First, let us derive the entropy of a real random vector $\mathbf{Z}\in \mathbb{R}^{L\times1}$ with a multivariate Gaussian pdf of mean $0$ and a generic covariance matrix $\boldsymbol{\Sigma}$:
$\rightarrow$ Note that the pdf in this case (since $\boldsymbol{\mu}=\mathbf{0}$) is given by:
$$f_{\mathbf{Z}}\left(z_{1}, \ldots, z_{L}\right)=\frac{\exp \left(-\frac{1}{2}\mathbf{z}^T\boldsymbol{\Sigma}^{-1}\mathbf{z}\right)}{\sqrt{(2 \pi)^{L}\det(\boldsymbol{\Sigma})}}$$
$\rightarrow$ The differential entropy is then:
$$ \begin{align} H(\mathbf{Z})&=-\int_{-\infty}^{\infty} f_\mathbf{Z}(\mathbf{z}) \log _{2}(f_\mathbf{Z}(\mathbf{z}))\, d \mathbf{z}\\ &= \frac{1}{2} \log _{2}\left((2 \pi)^{L} \det(\boldsymbol{\Sigma})\right)\int_{-\infty}^{\infty} f_\mathbf{Z}(\mathbf{z})\, d \mathbf{z}+\frac{1}{2} \log _{2}(e)\int_{-\infty}^{\infty} f_\mathbf{Z}(\mathbf{z})\,\mathbf{z}^T\boldsymbol{\Sigma}^{-1} \mathbf{z}\, d \mathbf{z}\\ &=\frac{1}{2} \log _{2}\det\left(2 \pi \boldsymbol{\Sigma}\right)+\frac{1}{2} \log _{2}(e)\, \mathbb{E}\left[\mathbf{z}^T \boldsymbol{\Sigma}^{-1} \mathbf{z}\right] \\ &=\frac{1}{2} \log _{2}\det\left(2 \pi \boldsymbol{\Sigma}\right)+\frac{1}{2} \log _{2}(e)\,\mathbb{E}\left[\operatorname{tr}\left(\boldsymbol{\Sigma}^{-1}\mathbf{z} \mathbf{z}^T\right)\right] \quad \text{[trace trick: } \mathbf{z}^T\boldsymbol{\Sigma}^{-1}\mathbf{z}=\operatorname{tr}(\boldsymbol{\Sigma}^{-1}\mathbf{z}\mathbf{z}^T)\text{]}\\ &=\frac{1}{2} \log _{2}\det\left(2 \pi \boldsymbol{\Sigma}\right)+\frac{1}{2} \log _{2}(e)\cdot \operatorname{tr}\left(\boldsymbol{\Sigma}^{-1}\mathbb{E}\left[\mathbf{z} \mathbf{z}^T\right]\right)\\ &=\frac{1}{2} \log _{2}\det\left(2 \pi \boldsymbol{\Sigma}\right)+\frac{1}{2} \log _{2}(e)\cdot \operatorname{tr}\left(\boldsymbol{\Sigma}^{-1}\boldsymbol{\Sigma}\right)\\ &=\frac{1}{2} \log _{2}\det\left(2 \pi \boldsymbol{\Sigma}\right)+\frac{L}{2} \log _{2}(e)\\ &=\frac{1}{2} \log _{2}\det\left(2 \pi e \boldsymbol{\Sigma}\right) \end{align} $$ Here the first and last steps use $\det(c\mathbf{A})=c^L\det(\mathbf{A})$ for an $L\times L$ matrix $\mathbf{A}$, and the expectation step uses $\mathbb{E}[\mathbf{z}\mathbf{z}^T]=\boldsymbol{\Sigma}$.
- Now, as @Stelios pointed out, for a circularly symmetric complex Gaussian random vector $\mathbf{X}$ with covariance $\boldsymbol{\Sigma}$, whose real and imaginary parts are independent, each with covariance $\boldsymbol{\Sigma}/2$, we can write:
$$ \begin{aligned} H(\mathbf{X}) &=H(\Re(\mathbf{X}), \Im(\mathbf{X})) \\ &=H(\Re(\mathbf{X}))+H(\Im(\mathbf{X})) \\ &=\frac{1}{2} \log\left(\det\left(2 \pi e \frac{\boldsymbol{\Sigma}}{2}\right)\right)+\frac{1}{2} \log\left(\det\left(2 \pi e \frac{\boldsymbol{\Sigma}}{2}\right)\right) \\ &= \log\left(\det\left(\pi e \boldsymbol{\Sigma}\right)\right) \end{aligned} $$
- For your specific case, where the covariance matrix is $\sigma^2\mathbf{I}$, substituting gives the required entropy expression (a numerical sanity check of both formulas is sketched after this list):
$$H(\mathbf{X}) = \log\left(\det\left(\pi e \sigma^2\mathbf{I}\right)\right)$$
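As a quick sanity check of both closed forms above, one can estimate $H=\mathbb{E}[-\log_2 f]$ by simple Monte Carlo averaging. Below is a minimal NumPy sketch; the values $L=4$, $\sigma^2=2.5$, the randomly generated $\boldsymbol{\Sigma}$, and the sample count are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
L, sigma2, N = 4, 2.5, 200_000

# --- Real case: Z ~ N(0, Sigma), closed form H = (1/2) log2 det(2*pi*e*Sigma) ---
A = rng.standard_normal((L, L))
Sigma = A @ A.T + L * np.eye(L)            # a generic positive-definite covariance
z = rng.multivariate_normal(np.zeros(L), Sigma, size=N)
quad = np.einsum('ij,jk,ik->i', z, np.linalg.inv(Sigma), z)   # z^T Sigma^{-1} z per sample
neg_log2_f = 0.5 * quad / np.log(2) + 0.5 * np.log2((2 * np.pi) ** L * np.linalg.det(Sigma))
print(neg_log2_f.mean(), 0.5 * np.log2(np.linalg.det(2 * np.pi * np.e * Sigma)))

# --- Complex case: x circularly symmetric with covariance sigma2*I, i.e. real and
# --- imaginary parts i.i.d. N(0, (sigma2/2) I); closed form H = L log2(pi*e*sigma2)
x = np.sqrt(sigma2 / 2) * (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L)))
norm2 = np.sum(np.abs(x) ** 2, axis=1)
neg_log2_p = L * np.log2(np.pi * sigma2) + norm2 / (sigma2 * np.log(2))
print(neg_log2_p.mean(), L * np.log2(np.pi * np.e * sigma2))
```

Each printed pair (Monte Carlo estimate, closed form) should agree to roughly two decimal places at this sample size.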
Formally, the entropy $H(X)$ of a complex random variable $X$ is defined as the entropy $H(\Re(X),\Im(X))$ of the (vector) random variable $[\Re(X),\Im(X)]$, consisting of the real and imaginary components of $X$. (This is in accordance with how the pdf of a complex variable is defined.) Now, for the case of $X$ being circularly symmetric Gaussian of zero mean and covariance $\sigma^2 \mathbf{I}$, its real and imaginary components are i.i.d. Gaussian of zero mean and covariance $(\sigma^2/2)\mathbf{I}$. Therefore,
$$ \begin{align} H(X) &= H(\Re(X),\Im(X)) \\ &= H(\Re(X)) + H(\Im(X))\\ &= \frac{1}{2} \log \det \left(2\pi e \frac{\sigma^2}{2} \mathbf{I} \right) + \frac{1}{2} \log \det \left(2\pi e \frac{\sigma^2}{2} \mathbf{I} \right)\\ &= \log \det \left(\pi e \sigma^2 \mathbf{I} \right) \end{align} $$
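Since $\det(\pi e \sigma^2 \mathbf{I})=(\pi e \sigma^2)^L$, this can equivalently be written as $$H(X)=L\log_2(\pi e \sigma^2),$$ i.e. $\log_2(\pi e \sigma^2)$ bits per complex dimension, which is exactly the expression the question was aiming for.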