I ran into a question about a result stated in Casella & Berger's book Statistical Inference:
Let $X, Y$ be random variables with joint pdf $f_{XY}$ and marginal pdf $f_X$ and $f_Y$. If there exist functions $g$ and $h$ such that $f_{XY}(x,y) = g(x) h(y)$ for all $x,y \in \mathbb{R}$, then $X, Y$ are independent.
This theorem has a multivariate generalization (Theorem 4.6.11 in Casella & Berger): if the joint pdf factorizes as $f_{X_1, X_2, \dots, X_n}(x_1, \dots, x_n) = g_1(x_1) g_2(x_2) \cdots g_n(x_n)$ for all $(x_1, \dots, x_n) \in \mathbb{R}^n$, then $X_1, \dots, X_n$ are mutually independent. $(*)$
Now take Example 2.7.2 in Hogg's book Introduction to Mathematical Statistics (8th ed.). Let $X_1, X_2, X_3$ be iid with common pdf $f(x)= e^{-x}$ for $x>0$. Define the transformation $$Y_1 = \frac{X_1}{\sum_{i=1}^3 X_i}, \quad Y_2 = \frac{X_2}{\sum_{i=1}^3 X_i}, \quad Y_3 = \sum_{i=1}^3 X_i. $$ This transformation maps the support of $(X_1, X_2, X_3)$ onto the set $$ T=\{(y_1, y_2, y_3): 0<y_1,\; 0<y_2,\; 0<1-y_1-y_2,\; 0< y_3\}. $$ One can find the joint pdf of $(Y_1, Y_2, Y_3)$, call it $f_{Y_1, Y_2, Y_3}(y_1, y_2, y_3)$, as $$f_{Y_1, Y_2, Y_3}(y_1, y_2, y_3)= \underbrace{y_3^2 e^{-y_3}}_{:=g(y_3)} \quad \text{for } (y_1, y_2, y_3) \in T.$$
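As a quick sanity check (my own Monte Carlo sketch, not from either book), one can simulate the $X_i$ and verify that every transformed sample indeed lands in $T$, and that $Y_3 = \sum_i X_i$ behaves like a Gamma$(3,1)$ variable, consistent with the $y_3^2 e^{-y_3}$ factor:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X1, X2, X3 iid Exponential(1), i.e. pdf e^{-x} for x > 0
x = rng.exponential(scale=1.0, size=(n, 3))
s = x.sum(axis=1)

# The transformation (Y1, Y2, Y3)
y1, y2, y3 = x[:, 0] / s, x[:, 1] / s, s

# Every sample lands in T = {0 < y1, 0 < y2, 0 < 1 - y1 - y2, 0 < y3}
assert np.all((y1 > 0) & (y2 > 0) & (1 - y1 - y2 > 0) & (y3 > 0))

# Y3 = X1 + X2 + X3 ~ Gamma(3, 1), so its sample mean should be near 3
print(y3.mean())
```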
Then the joint pdf seems to factorize automatically as $f_{Y_1, Y_2, Y_3}(y_1, y_2, y_3) = g(y_3) \cdot 1 \cdot 1$, so according to $(*)$ above, $Y_1, Y_2, Y_3$ should be independent.
But this is clearly not the case: if one calculates the marginals, one sees that $$ f_{Y_1,Y_2,Y_3} \neq f_{Y_1} f_{Y_2} f_{Y_3}. $$
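The dependence is also easy to see numerically (again my own sketch, assuming NumPy): marginally each of $Y_1, Y_2$ exceeds $0.6$ with positive probability, yet jointly they never do, since $y_1 + y_2 < 1$ on the support; their sample correlation is also strongly negative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.exponential(size=(n, 3))
s = x.sum(axis=1)
y1, y2 = x[:, 0] / s, x[:, 1] / s

# Marginally, each of Y1, Y2 exceeds 0.6 with positive probability...
print((y1 > 0.6).mean(), (y2 > 0.6).mean())

# ...but jointly never, because y1 + y2 < 1 everywhere on the support,
# so P(Y1 > 0.6, Y2 > 0.6) = 0 != P(Y1 > 0.6) * P(Y2 > 0.6)
print(((y1 > 0.6) & (y2 > 0.6)).mean())

# The sample correlation of (Y1, Y2) is clearly negative, not 0
print(np.corrcoef(y1, y2)[0, 1])
```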
I think I must have misunderstood some part of Theorem 4.6.11 in Casella & Berger. Could someone help point out the flaw in my logic? Thanks.