The differential entropy of the multivariate Student-t distribution when the scale matrix is the identity ($\boldsymbol{\Sigma} = \boldsymbol{I}$) is given by $$ h = - \log \frac{ \Gamma \left( \frac{\nu+d}{2} \right) } { \Gamma \left( \frac{\nu}{2} \right) (\nu \pi)^{\frac{d}{2}} } + \left( \frac{\nu+d}{2} \right) \left( \psi \left( \frac{\nu+d}{2} \right) - \psi \left( \frac{\nu}{2} \right) \right) $$ where $d$ is the dimension, $\nu$ the degrees of freedom, and $\psi$ the digamma function (source: "Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions" by Arellano-Valle et al.).
What about the general case ($\boldsymbol{\Sigma} \ne \boldsymbol{I}$)?
To get the differential entropy in the general case, we draw on two properties of differential entropy:

1. It is invariant under translation: $h(\boldsymbol{x} + \boldsymbol{c}) = h(\boldsymbol{x})$ for any constant vector $\boldsymbol{c}$.
2. Under an invertible linear map $\boldsymbol{A}$, it transforms as $h(\boldsymbol{A}\boldsymbol{x}) = h(\boldsymbol{x}) + \log |\det \boldsymbol{A}|$.

Since the multivariate Student-t family is closed under affine transformations (with the same $\nu$), a Student-t vector with scale matrix $\boldsymbol{\Sigma}$ can be written as $\boldsymbol{\mu} + \boldsymbol{A}\boldsymbol{z}$, where $\boldsymbol{z}$ is standard Student-t and $\boldsymbol{A}\boldsymbol{A}^\top = \boldsymbol{\Sigma}$, so that $\log |\det \boldsymbol{A}| = \frac{1}{2} \log |\boldsymbol{\Sigma}|$.
As a result, for a Student-t random vector $\boldsymbol{x}$ with location $\boldsymbol{\mu}$ and scale matrix $\boldsymbol{\Sigma}$, we can write $$ h(\boldsymbol{x}) = h_{\boldsymbol{\Sigma}=\boldsymbol{I}} + \frac{1}{2} \log | \boldsymbol{\Sigma} | $$ where $$ h_{\boldsymbol{\Sigma}=\boldsymbol{I}} = - \log \frac{ \Gamma \left( \frac{\nu+d}{2} \right) } { \Gamma \left( \frac{\nu}{2} \right) (\nu \pi)^{\frac{d}{2}} } + \left( \frac{\nu+d}{2} \right) \left( \psi \left( \frac{\nu+d}{2} \right) - \psi \left( \frac{\nu}{2} \right) \right) $$
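As a rough sketch, the general-case formula can be implemented in a few lines with SciPy (`mvt_entropy` is a name chosen here for illustration); in $d = 1$ with unit scale it should reduce to SciPy's built-in univariate Student-t entropy, which gives a quick sanity check:

```python
import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import t as student_t

def mvt_entropy(nu, Sigma):
    """Differential entropy of a d-dimensional Student-t distribution
    with degrees of freedom nu and positive-definite scale matrix Sigma."""
    Sigma = np.atleast_2d(Sigma)
    d = Sigma.shape[0]
    # Entropy for Sigma = I, using gammaln instead of Gamma for numerical stability:
    # h_I = -log[ Gamma((nu+d)/2) / (Gamma(nu/2) (nu*pi)^(d/2)) ]
    #       + (nu+d)/2 * (psi((nu+d)/2) - psi(nu/2))
    h_id = -(gammaln((nu + d) / 2) - gammaln(nu / 2) - (d / 2) * np.log(nu * np.pi))
    h_id += ((nu + d) / 2) * (digamma((nu + d) / 2) - digamma(nu / 2))
    # General case: add half the log-determinant of the scale matrix.
    _, logdet = np.linalg.slogdet(Sigma)
    return h_id + 0.5 * logdet

# Sanity check in d = 1: matches SciPy's univariate Student-t entropy.
nu = 5.0
print(np.isclose(mvt_entropy(nu, np.eye(1)), student_t.entropy(nu)))
```

Scaling the univariate case also checks the second term: `mvt_entropy(nu, [[s**2]])` should equal `student_t.entropy(nu) + np.log(s)`.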