Say I have a likelihood function $p(y \mid \theta)$ and I make the reparameterization $\varphi = h(\theta)$ using a bijective function $h$ with inverse $h^{-1}$.
Then it must be that:
$p(y \mid \theta) = p\left(y \mid h^{-1}(\varphi)\right)$
But is it also true that:
$p\left(y \mid h^{-1}(\varphi)\right) = p(y \mid \varphi)$
or is the statement complete nonsense?
If $h$ is a bijection, then conditioning on $\varphi = h(\theta)$ carries exactly the same information as conditioning on $\theta$, so the conditional distribution of $y$ is the same. That is, $p(y \mid \theta) = p(y \mid h(\theta)) = p(y \mid \varphi) = p(y \mid h^{-1}(\varphi))$. Note that no Jacobian factor appears here: the density is over $y$, and only the conditioning variable is being relabeled. (A Jacobian would only enter if you transformed the density of $\theta$ itself, e.g. a prior.)

If you want to prove this formally, go back to the definition of conditional distributions.
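As a quick numerical sanity check, here is a minimal sketch assuming a Normal likelihood $y \sim \mathcal{N}(\theta, 1)$ and the bijection $h(\theta) = e^{\theta}$, both chosen purely for illustration:

```python
import math

# Hypothetical example: y ~ Normal(theta, 1), so p(y | theta) is the
# standard Gaussian density evaluated at y - theta.
def likelihood(y, theta):
    return math.exp(-0.5 * (y - theta) ** 2) / math.sqrt(2 * math.pi)

# Bijective reparameterization phi = h(theta) = exp(theta),
# with inverse theta = h^{-1}(phi) = log(phi).
h = math.exp
h_inv = math.log

y, theta = 1.3, 0.7
phi = h(theta)

# The likelihood value is unchanged: p(y | theta) = p(y | h^{-1}(phi)).
# No Jacobian correction is needed, since the density is over y.
assert math.isclose(likelihood(y, theta), likelihood(y, h_inv(phi)))
print(likelihood(y, theta), likelihood(y, h_inv(phi)))
```

The same check works for any bijection $h$ on the parameter space; the two calls differ only in how the conditioning value is labeled.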