I am reading the following paper, in which a change of variables is applied inside an entropy term:
Since $y=g(x;\theta)$ and $g$ is invertible, the probability density of $y$ should be:
$$p_y(y)=p_x(g^{-1}(y))\left|\dfrac{dx}{dy}\right|$$
Then I calculate the entropy of $y$ as:
$$H(y) = \mathbb{E}\left[ \log p_x(g^{-1}(y))\right] + \mathbb{E}[ \log \left| \dfrac{dx}{dy} \right| ]$$
But in the paper, the matrix $J$ is defined as the Jacobian of $y$ with respect to $x$. Moreover, the $const.$ term does not make sense to me. Am I missing something here, or is something wrong with the paper's approach?

You are missing a minus sign in your expression for the entropy of $y$. It should be $$ \begin{align} \mathcal{H}(y) &= \mathbb{E}\left[ \log\left(\frac{1}{p(y)}\right) \right]\\ &= -\int p(y) \log\left(p(y)\right) \mathrm{d}y \\ &= -\int p(x) \log\left(\left|\frac{\mathrm{d}x}{\mathrm{d}y}\right|p(x)\right) \mathrm{d}x\\ &= \int p(x) \log\left(\frac{1}{p(x)}\right) \mathrm{d}x +\int p(x) \log\left(\left|\frac{\mathrm{d}y}{\mathrm{d}x}\right|\right) \mathrm{d}x\\ &= \mathcal{H}(x) + \mathbb{E}\left[\log\left|\frac{\mathrm{d}y}{\mathrm{d}x}\right|\right] \end{align} $$ In that particular paper (can you please put a link to it?), the entropy of $x$ is constant with respect to the parameters $\mathbf{\Theta}$, which is why it can be absorbed into the $const.$ term: it does not affect the optimization over $\mathbf{\Theta}$.
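The identity $\mathcal{H}(y) = \mathcal{H}(x) + \mathbb{E}\left[\log\left|\frac{\mathrm{d}y}{\mathrm{d}x}\right|\right]$ is easy to sanity-check numerically. Here is a minimal sketch assuming a concrete example not from the paper: $x \sim \mathcal{N}(0,1)$ and the invertible transform $y = e^x$, so that $y$ is log-normal with a known closed-form entropy to compare against.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)   # x ~ N(0, 1), an illustrative choice
y = np.exp(x)                    # invertible transform; dy/dx = exp(x) > 0

# Closed-form differential entropy of a standard normal: 0.5 * log(2*pi*e)
H_x = 0.5 * np.log(2 * np.pi * np.e)

# Monte Carlo estimate of E[log|dy/dx|] = E[log(exp(x))] = E[x] ~ 0
E_log_jac = np.mean(np.log(np.abs(np.exp(x))))

# Entropy of y via the change-of-variables identity
H_y_formula = H_x + E_log_jac

# Closed-form entropy of LogNormal(mu=0, sigma=1): mu + 0.5*log(2*pi*e*sigma^2)
H_y_exact = 0.5 * np.log(2 * np.pi * np.e)

print(H_y_formula, H_y_exact)  # the two values should agree closely
```

With a Monte Carlo sample this large the two values agree to about two decimal places; note that without the $\log$ on the Jacobian term the check fails, which is a quick way to catch the typo discussed above.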