I am following a proof in Casella and Berger, but I am having difficulty understanding one particular step.
Assuming that $X_1,\dots,X_n$ is a random sample (i.i.d.) drawn from a $N(0,1)$ distribution, they derive the joint distribution of the following transformation:
$y_1=\overline{x}$
$y_2=x_2-\overline{x}$
...
$y_n=x_n-\overline{x}$
I know that in order to obtain the pdf of the transformation we have to compute the inverse transformation, calculate the determinant of the Jacobian, etc.
What I obtain after carrying out the whole procedure is that the joint pdf of the transformation is:
$$\frac{n}{(2\pi)^{\frac{n}2}}e^{-\frac12\left(y_1-\frac1n\sum_2^n(y_i+y_1)\right)^2}e^{-\frac12\sum_2^n(y_i+y_1)^2}$$
since $y_1=\frac{1}n\sum_1^nx_i$ we have that $x_1=y_1-\frac{1}n\sum_2^n(y_i+y_1)$ which is equal to $x_1=\frac{1}n(y_1-\sum_2^ny_i)$
and hence the final expression would be: $$\frac{n}{(2\pi)^{\frac{n}2}}e^{-\frac12\left(\frac1n\left(y_1-\sum_2^ny_i\right)\right)^2}e^{-\frac12\sum_2^n(y_i+y_1)^2}$$
Well, if you look at Theorem 5.3.1 in Casella and Berger's "Statistical Inference", they write:
$$\frac{n}{(2\pi)^{\frac{n}2}}e^{-\frac12\left(y_1-\sum_2^ny_i\right)^2}e^{-\frac12\sum_2^n(y_i+y_1)^2}$$ (without the $\frac1n$ in the first exponential)
Is that a typo or did I make some mistake?
Your error is in the assertion $x_1=y_1-\frac1n\sum_2^n(y_i+y_1)$. Since $y_1=\frac1n\sum_1^n x_i$, we have $$\sum x_i = ny_1,$$ and so $$x_1 = ny_1 - \sum_2^n x_i=ny_1-\sum_2^n (y_i + y_1)=y_1 - \sum_2^n y_i.$$ Your calculation introduced a spurious factor of $\frac1n$; equivalently, it dropped the factor of $n$.
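If it helps, here is a quick numerical sanity check of the inverse formula (a sketch in plain Python; the sample values are arbitrary, chosen only for illustration):

```python
# Check the inverse of the transformation y1 = xbar, y_i = x_i - xbar (i >= 2):
# the correct inverse is x1 = y1 - sum_{i=2}^n y_i, not (1/n)(y1 - sum_{i=2}^n y_i).
x = [1.0, 2.0, -0.5, 3.0, 0.25]  # an arbitrary "sample" of size n = 5
n = len(x)

xbar = sum(x) / n
y1 = xbar
y_rest = [x[i] - xbar for i in range(1, n)]  # y_2, ..., y_n

x1_correct = y1 - sum(y_rest)          # recovers x_1
x1_with_1_over_n = x1_correct / n      # the version with the extra 1/n factor

print(abs(x1_correct - x[0]))          # ~0: matches the original x_1
print(abs(x1_with_1_over_n - x[0]))    # clearly nonzero: the 1/n version fails
```

The first formula reconstructs $x_1$ up to floating-point rounding, while the version carrying the extra $\frac1n$ does not.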