$$I(X;Y) = -\frac{1}{2} \ln(1-\rho^2)$$ is the mutual information between two jointly Gaussian random variables with correlation coefficient $\rho$. What source derives this formula? Could someone provide the full derivation here as an answer?
First Attempt
Given that $f(x)$ is the Gaussian p.d.f. of variable $X$ and $f(y)$ is the Gaussian p.d.f. of variable $Y$, and \begin{align} f(x,y)&=\frac{1}{\left( (2\pi)^{n}\det{(\boldsymbol \Sigma)}\right)^\frac{1}{2} }\exp\left(-\frac{1}{2}(\boldsymbol{x}-\boldsymbol{\mu})^\top\boldsymbol{\Sigma}^{-1}(\boldsymbol{x}-\boldsymbol{\mu})% \right)\\ & = \frac{1}{2 \pi \sigma_X \sigma_Y \sqrt{1-\rho^2}} \exp \left\{ -\frac{1}{2\left(1-\rho^2\right)} \left[ \left(\frac{x-\mu_{X}}{\sigma_{X}}\right)^2 + \left(\frac{y-\mu_{Y}}{\sigma_{Y}}\right)^2 -2\rho \left(\frac{x-\mu_{X}}{\sigma_{X}}\right) \left(\frac{y-\mu_{Y}}{\sigma_{Y}}\right) \right] \right\} \end{align} is the joint distribution (the general $n$-dimensional form, specialized here to $n=2$), where $\boldsymbol{x}, \boldsymbol{\mu}$ and $\boldsymbol{\Sigma}$ are the data observations, means and covariance matrix of the joint distribution,
\begin{align} I(X;Y) &= \int \int f(x,y) \ln \frac{f(x,y)}{f(x)f(y)} dx dy\\ &= \int \int f(x,y) \ln \frac{f(x,y)}{\left(2\pi \sigma_X^2\right)^{-\frac{1}{2}} e^{-(x-\mu_X)^2 / 2\sigma_X^2} \cdot \left(2\pi \sigma_Y^2\right)^{-\frac{1}{2}} e^{-(y-\mu_Y)^2 / 2\sigma_Y^2}} dx dy\\ &= ? \end{align}
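For completeness, the direct integral above can be finished by viewing it as the expectation of the log-ratio under $f(x,y)$. A sketch, writing $z_X = (x-\mu_X)/\sigma_X$ and $z_Y = (y-\mu_Y)/\sigma_Y$ for the standardized variables: expanding the logarithm with the densities defined above gives
$$\ln \frac{f(x,y)}{f(x)f(y)} = -\frac{1}{2}\ln\left(1-\rho^2\right) - \frac{z_X^2 + z_Y^2 - 2\rho\, z_X z_Y}{2\left(1-\rho^2\right)} + \frac{z_X^2}{2} + \frac{z_Y^2}{2}.$$
Taking expectations under $f(x,y)$, and using $\mathbb{E}[z_X^2]=\mathbb{E}[z_Y^2]=1$ and $\mathbb{E}[z_X z_Y]=\rho$,
\begin{align} I(X;Y) &= -\frac{1}{2}\ln\left(1-\rho^2\right) - \frac{1 + 1 - 2\rho^2}{2\left(1-\rho^2\right)} + \frac{1}{2} + \frac{1}{2} \\ &= -\frac{1}{2}\ln\left(1-\rho^2\right) - 1 + 1 = -\frac{1}{2}\ln\left(1-\rho^2\right). \end{align}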
Let $(X, Y) \sim \mathcal{N}(0, K),$ where $$ K=\left[\begin{array}{cc} \sigma^{2} & \rho \sigma^{2} \\ \rho \sigma^{2} & \sigma^{2} \end{array}\right] $$ Then $$h(X)=h(Y)=\frac{1}{2} \log \left(2 \pi e \sigma^{2}\right)$$ and $$h(X, Y)=\frac{1}{2} \log \left[(2 \pi e)^{2}|K|\right]= \frac{1}{2} \log \left[(2 \pi e)^{2} \sigma^{4}\left(1-\rho^{2}\right)\right],$$ and therefore $$ I(X ; Y)=h(X)+h(Y)-h(X, Y)=-\frac{1}{2} \log \left(1-\rho^{2}\right). $$ If $\rho=0$, then $X$ and $Y$ are independent and the mutual information is 0. If $\rho=\pm 1$, then $X$ and $Y$ are perfectly correlated and the mutual information is infinite. As for a source: this is a standard textbook derivation; it appears, for example, in Cover & Thomas, *Elements of Information Theory*, as the worked example on mutual information between correlated Gaussian random variables.
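If you want a numerical sanity check of the closed form, here is a minimal sketch with NumPy. It compares the formula $-\frac{1}{2}\ln(1-\rho^2)$ against the entropy decomposition $h(X)+h(Y)-h(X,Y)$, and against a plug-in estimate from simulated data (the sample size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
sigma = 1.0
K = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])  # covariance matrix

# Closed-form mutual information (in nats)
I_true = -0.5 * np.log(1 - rho**2)

# Same value via I = h(X) + h(Y) - h(X, Y) with Gaussian entropies
h_X = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
h_XY = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(K))
I_entropy = 2 * h_X - h_XY

# Plug-in estimate: fit rho from samples, reuse the closed form
samples = rng.multivariate_normal([0.0, 0.0], K, size=200_000)
rho_hat = np.corrcoef(samples.T)[0, 1]
I_hat = -0.5 * np.log(1 - rho_hat**2)

print(I_true, I_entropy, I_hat)
```

The first two quantities agree exactly (the determinant of $K$ is $\sigma^4(1-\rho^2)$, so the $\sigma$ terms cancel), and the plug-in estimate converges to the same value as the sample size grows.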