I'm trying to prove or falsify Proposition 1.8 in Shao, Mathematical Statistics:
It suffices to show that, for any $A\in \mathbb{B}^k$ and $i=1,2,\dots,m$,
$\int_{A\cap g(A_{i})} f_{Y}(y) dy=\int_{g^{-1}(A)\cap A_{i}}(f_{Y}\circ g)(x)\cdot\frac{dy}{dx}(x)dx $
where $\frac{dy}{dx}$ exists a.e.
But I don't know how to verify this. If it is true, the rigorous proof might involve a measure-theoretic change-of-variables formula for several variables, e.g. Rudin 7.26. If it is false, we need a counterexample restricted to the probability setting. Both of my attempts have failed. Thanks for answering my question!
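As a quick sanity check of the displayed identity (my own illustration, not part of the question), take $k=1$, $g(x)=x^2$, $X\sim N(0,1)$, so $Y=X^2\sim\chi^2_1$. Choosing the piece $A_2=[0,\infty)$ (where $g$ is injective) and the Borel set $A=[1,4]$, the identity reduces to $\int_{[1,4]} f_Y(y)\,dy=\int_{[1,2]} f_Y(x^2)\,|2x|\,dx$, which can be checked numerically:

```python
import math

def f_Y(y):
    # chi-squared(1) density: exp(-y/2) / sqrt(2*pi*y) for y > 0
    return math.exp(-y / 2) / math.sqrt(2 * math.pi * y)

def integrate(f, a, b, n=50_000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# A = [1, 4], A_2 = [0, inf); g(x) = x^2 is injective on A_2,
# g(A_2) = [0, inf), g^{-1}(A) ∩ A_2 = [1, 2], dy/dx = 2x there.
lhs = integrate(f_Y, 1.0, 4.0)                           # ∫_{A ∩ g(A_2)} f_Y(y) dy
rhs = integrate(lambda x: f_Y(x * x) * 2 * x, 1.0, 2.0)  # ∫_{g^{-1}(A) ∩ A_2} f_Y(g(x)) (dy/dx) dx

assert abs(lhs - rhs) < 1e-6
```

This only confirms one instance, of course; the substitution $y=x^2$, $dy=2x\,dx$ shows why the two integrals agree here.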

For $x \in \mathbb{R}^k$, let $Jg(x) = |\det Dg(x)|$. I believe all we need to obtain the transformation formula $$f_Y(y) = \sum_{x \in g^{-1}(\{y\})}f_X(x)Jg(x)^{-1}$$ is that $g$ is $C^1$ and $Jg(x) > 0$ a.e.
Here is the computation I want to carry out to prove it: for any Borel set $A \subset \mathbb{R}^k$, \begin{align} P(Y \in A) &= P(X \in g^{-1}(A)) \\ &= \int_{g^{-1}(A)}f_X(x)\,dx \\ &= \int_{\mathbb{R}^k}f_X(x)1_{A}(g(x))\,dx \\ &= \int_{\mathbb{R}^k}f_X(x)1_{A}(g(x))Jg(x)^{-1}Jg(x)\,dx \\ &= \int_{\mathbb{R}^k}\sum_{x \in g^{-1}(\{y\})}f_X(x)1_{A}(g(x))Jg(x)^{-1}\,dy \\ &= \int_{A}\sum_{x \in g^{-1}(\{y\})}f_X(x)Jg(x)^{-1}\,dy. \end{align} From this we get $$f_Y(y) = \sum_{x \in g^{-1}(\{y\})}f_X(x)Jg(x)^{-1}.$$ Now to justify the equalities: the fourth equality is valid as long as $Jg > 0$ a.e., i.e. as long as $Dg(x)$ is invertible for a.e. $x \in \mathbb{R}^k$. The fifth equality uses the change of variables formula in the form of Theorem 3.9 from *Measure Theory and Fine Properties of Functions* by Evans and Gariepy, so it is valid whenever $g$ is Lipschitz, and I think it should also hold when $g$ is $C^1$.
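To see the resulting formula in action (a concrete example of my own choosing, not from the question), take $g(x)=x^2$ with $X\sim N(0,1)$. Then $g^{-1}(\{y\})=\{\pm\sqrt{y}\}$ and $Jg(x)=|2x|=2\sqrt{y}$, so the formula predicts $f_Y(y)=\big(f_X(\sqrt{y})+f_X(-\sqrt{y})\big)/(2\sqrt{y})$, which should coincide with the $\chi^2_1$ density:

```python
import math

def f_X(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_Y_formula(y):
    # f_Y(y) = sum over x in g^{-1}({y}) of f_X(x) * Jg(x)^{-1},
    # with g(x) = x^2: g^{-1}({y}) = {±sqrt(y)}, Jg(x) = |2x| = 2*sqrt(y)
    r = math.sqrt(y)
    return (f_X(r) + f_X(-r)) / (2 * r)

def f_chi2_1(y):
    # known chi-squared(1) density for comparison
    return math.exp(-y / 2) / math.sqrt(2 * math.pi * y)

for y in (0.1, 0.5, 1.0, 2.5, 7.0):
    assert abs(f_Y_formula(y) - f_chi2_1(y)) < 1e-12
```

Here the two preimages contribute equally by symmetry of $f_X$, so the sum collapses to $f_X(\sqrt{y})/\sqrt{y}$.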