I want to know how multivariate differential entropy changes when one of its dimensions is scaled. Specifically, is there a way to simplify the expression
$$H(\alpha X, Y, Z, ...)$$
as a function of a positive constant $\alpha$ for an arbitrary multivariate probability distribution? I am mostly interested in the 2D case, but the general $n$-dimensional case would also be great to understand. According to Wikipedia, the result for 1D is
$$H(\alpha X) = H(X) + \log(|\alpha|)$$
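For context, I believe the 1D case follows directly from the change-of-variables formula for densities (assuming $\alpha > 0$ here for simplicity):
$$
f_{\alpha X}(y) = \frac{1}{\alpha} f_X\!\left(\frac{y}{\alpha}\right),
\qquad
H(\alpha X) = -\int \frac{1}{\alpha} f_X\!\left(\frac{y}{\alpha}\right)\left[\log f_X\!\left(\frac{y}{\alpha}\right) - \log \alpha\right] dy,
$$
and substituting $x = y/\alpha$ recovers $H(X) + \log \alpha$.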
and for multiplication by an arbitrary invertible matrix $A$ the result is
$$H(A X) = H(X) + \log(|\det A|)$$
with a reference to Cover and Thomas, page 253. However, Cover and Thomas do not give a proof of the second equation. If possible, I would appreciate a proof of the general case.
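As a numerical sanity check (not a proof), I verified the matrix identity in the Gaussian case, where the entropy has the closed form $H(X) = \tfrac{1}{2}\log\!\big((2\pi e)^n \det \Sigma\big)$ and $\operatorname{Cov}(AX) = A \Sigma A^\top$; the specific $\Sigma$ and $A$ below are arbitrary choices:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of an n-dim Gaussian: 0.5*log((2*pi*e)^n * det(cov))."""
    n = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])   # covariance of X (arbitrary example)
A = np.array([[1.5, 0.2],
              [0.0, 0.7]])       # arbitrary nonsingular matrix

h_X = gaussian_entropy(Sigma)
h_AX = gaussian_entropy(A @ Sigma @ A.T)   # Cov(AX) = A Sigma A^T

# H(AX) - H(X) should equal log|det A|
print(np.isclose(h_AX - h_X, np.log(abs(np.linalg.det(A)))))
```

This agrees because $\det(A \Sigma A^\top) = (\det A)^2 \det \Sigma$, but of course it only covers the Gaussian family, which is why I am asking for the general proof.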