Differential entropy of the multivariate Student-t distribution


The differential entropy of the $d$-dimensional multivariate Student-t distribution with $\nu$ degrees of freedom, when the scale matrix is the identity matrix, is given by $$ h = - \log \frac{ \Gamma \left( \frac{\nu+d}{2} \right) } { \Gamma \left( \frac{\nu}{2} \right) (\nu \pi)^{\frac{d}{2}} } + \left( \frac{\nu+d}{2} \right) \left( \psi \left( \frac{\nu+d}{2} \right) - \psi \left( \frac{\nu}{2} \right) \right) $$ (source: *Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions* by Arellano-Valle et al.).
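
As a numeric sanity check, the identity-scale formula can be coded directly with SciPy's `gammaln` and `digamma`; this is a sketch, and the function name `t_entropy_identity` is mine:

```python
import numpy as np
from scipy.special import gammaln, digamma

def t_entropy_identity(nu, d):
    """Differential entropy (in nats) of a d-dimensional Student-t
    distribution with nu degrees of freedom and identity scale matrix."""
    half_sum = (nu + d) / 2.0
    half_nu = nu / 2.0
    # log of the normalizing constant Gamma((nu+d)/2) / (Gamma(nu/2) (nu*pi)^(d/2))
    log_norm = gammaln(half_sum) - gammaln(half_nu) - (d / 2.0) * np.log(nu * np.pi)
    return -log_norm + half_sum * (digamma(half_sum) - digamma(half_nu))

# For nu = 1, d = 1 this reduces to the standard Cauchy entropy log(4*pi):
print(t_entropy_identity(1, 1), np.log(4 * np.pi))
```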

What about the general case ($\boldsymbol{\Sigma} \ne \boldsymbol{I}$)?

To get the differential entropy in the general case, we draw on two properties:

  • If $\boldsymbol{x}$ is a standard Student-t random vector, then $\boldsymbol{y} = \boldsymbol{\mu} + \boldsymbol{L} \boldsymbol{x}$ is a Student-t random vector with location $\boldsymbol{\mu}$ and scale matrix $\boldsymbol{\Sigma} = \boldsymbol{L}\boldsymbol{L}'$ (its covariance is $\frac{\nu}{\nu-2}\boldsymbol{\Sigma}$ for $\nu > 2$).
  • For a vector-valued random variable $\boldsymbol{x}$ and a square nonsingular matrix $\boldsymbol{A}$, we have $h(\boldsymbol{A} \boldsymbol{x}) = h(\boldsymbol{x}) + \log |\boldsymbol{A}|$, where $|\boldsymbol{A}|$ denotes the absolute value of the determinant (see https://en.wikipedia.org/wiki/Differential_entropy#Properties_of_differential_entropy).
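
The second property can be illustrated numerically in one dimension. The sketch below (variable names are mine) reuses the same Monte Carlo sample for both entropy estimates, so the estimated difference matches $\log a$ exactly, because the change of variables gives $\log p_{ax}(ax) = \log p_x(x) - \log a$ pointwise:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nu, a = 5.0, 3.0
x = stats.t.rvs(df=nu, size=100_000, random_state=rng)

# Monte Carlo entropy estimates: h = -E[log p]
h_x = -np.mean(stats.t.logpdf(x, df=nu))                # entropy of x
h_ax = -np.mean(stats.t.logpdf(a * x, df=nu, scale=a))  # entropy of a*x

# h(a*x) - h(x) should equal log|a|
print(h_ax - h_x, np.log(a))
```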

As a result, writing $\boldsymbol{y} = \boldsymbol{\mu} + \boldsymbol{L}\boldsymbol{x}$ and noting that the location shift does not change the entropy while $\log |\boldsymbol{L}| = \frac{1}{2} \log |\boldsymbol{L}\boldsymbol{L}'| = \frac{1}{2} \log |\boldsymbol{\Sigma}|$, we obtain for a Student-t random vector with location $\boldsymbol{\mu}$ and scale matrix $\boldsymbol{\Sigma}$ $$ h(\boldsymbol{y}) = h_{\boldsymbol{\Sigma}=\boldsymbol{I}} + \frac{1}{2} \log | \boldsymbol{\Sigma} | $$ where $$ h_{\boldsymbol{\Sigma}=\boldsymbol{I}} = - \log \frac{ \Gamma \left( \frac{\nu+d}{2} \right) } { \Gamma \left( \frac{\nu}{2} \right) (\nu \pi)^{\frac{d}{2}} } + \left( \frac{\nu+d}{2} \right) \left( \psi \left( \frac{\nu+d}{2} \right) - \psi \left( \frac{\nu}{2} \right) \right) $$
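
Putting the two pieces together, the general-case entropy is a one-line extension of the identity-scale formula; this is a sketch (the function name `t_entropy` is mine), using `slogdet` for numerical stability:

```python
import numpy as np
from scipy.special import gammaln, digamma

def t_entropy(nu, Sigma):
    """Differential entropy (in nats) of a multivariate Student-t
    distribution with nu degrees of freedom and scale matrix Sigma."""
    Sigma = np.atleast_2d(np.asarray(Sigma, dtype=float))
    d = Sigma.shape[0]
    half_sum, half_nu = (nu + d) / 2.0, nu / 2.0
    # entropy for Sigma = I
    h_identity = -(gammaln(half_sum) - gammaln(half_nu)
                   - (d / 2.0) * np.log(nu * np.pi)) \
                 + half_sum * (digamma(half_sum) - digamma(half_nu))
    # add (1/2) log|Sigma|; slogdet avoids overflow in det(Sigma)
    _, logdet = np.linalg.slogdet(Sigma)
    return h_identity + 0.5 * logdet

# 1-D check: a Cauchy (nu = 1) with scale 2, i.e. Sigma = [[4]],
# has entropy log(4*pi*2) = log(8*pi)
print(t_entropy(1, [[4.0]]), np.log(8 * np.pi))
```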