I am self-studying Statistics and trying to prove the following theorem:
Suppose $X_1, X_2$ are independent, $X_1$ has a chi-squared distribution with $n_1$ degrees of freedom, and $X = X_1 + X_2$ has a chi-squared distribution with $n$ degrees of freedom, where $n > n_1$. Then $X_2$ has a chi-squared distribution with $n - n_1$ degrees of freedom.
I tried to prove this by writing $X_2 = X - X_1$ and using the moment generating function approach, but I get stuck because $X$ and $X_1$ are not independent random variables, so I can't separate the two MGFs. Please drop a hint on how to proceed.
Thanks
It is probably better practice to use characteristic functions rather than moment generating functions, but the argument here is similar.
Let the characteristic functions be $\phi_{X_1}(s)$, $\phi_{X_2}(s)$ and $\phi_{X}(s)$. Since $X_1$ and $X_2$ are independent and $X=X_1+X_2$, we have
$$\phi_{X}(s) = \phi_{X_1}(s)\,\phi_{X_2}(s)$$
but the characteristic function of a chi-squared distribution with $n$ degrees of freedom is $(1-2is)^{-n/2}$, so
$$\phi_{X_2}(s)= \frac{\phi_{X}(s)}{ \phi_{X_1}(s)} = \frac{(1-2is)^{-n/2}}{(1-2is)^{-n_1/2}} = (1-2is)^{-(n-n_1)/2},$$ which is the required form. (Dividing by $\phi_{X_1}(s)$ is legitimate here because $(1-2is)^{-n_1/2}$ never vanishes.)
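As a quick numerical sanity check on this conclusion, here is a minimal Monte Carlo sketch using NumPy (the values $n_1 = 5$ and $n = 10$ are illustrative choices, not from the question): if $X_1 \sim \chi^2_{n_1}$ and an independent $X_2 \sim \chi^2_{n-n_1}$, then $X = X_1 + X_2$ should match the $\chi^2_n$ mean $n$ and variance $2n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n = 5, 10          # illustrative degrees of freedom (assumed for this sketch)
N = 1_000_000          # Monte Carlo sample size

x1 = rng.chisquare(n1, size=N)        # X1 ~ chi^2 with n1 df
x2 = rng.chisquare(n - n1, size=N)    # X2 ~ chi^2 with n - n1 df, independent of X1
x = x1 + x2                           # should behave like chi^2 with n df

print(x.mean())   # close to n  = 10
print(x.var())    # close to 2n = 20
```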
This is not a result special to chi-squared distributions. Essentially the same argument applies whenever $X$ is the sum of $n$ i.i.d. random variables, $X_1$ is the sum of $n_1$ independent random variables with the same distribution, and $X_2 = X - X_1$ is independent of $X_1$.
Independence of $X_1$ and $X_2$ matters; without it the result fails.
If $F_n(x)$ is the cumulative distribution function of a chi-square distribution with $n$ degrees of freedom, then consider
$$X = F^{-1}_n\left(F_{n_1}(X_1)\right)$$
which has a chi-squared distribution with $n$ degrees of freedom, but $$X_2 = X - X_1 = F^{-1}_n\left(F_{n_1}(X_1)\right) - X_1$$ does not have a chi-squared distribution: it has mean $n - n_1$ but a variance well below $2(n - n_1)$.
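The counterexample can be checked numerically. A minimal sketch with NumPy (again assuming the illustrative values $n_1 = 5$, $n = 10$): sorting a $\chi^2_{n_1}$ sample against a $\chi^2_n$ sample of the same size pairs values at matching quantiles, giving an empirical version of $X = F^{-1}_n(F_{n_1}(X_1))$. The resulting $X_2 = X - X_1$ has roughly the right mean but a much smaller variance than $2(n - n_1)$.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n = 5, 10        # illustrative values (not from the original post)
N = 200_000

# Sorting both samples pairs each X1 with the chi^2_n value at the same
# empirical quantile, i.e. X ≈ F_n^{-1}(F_{n1}(X1)).
x1 = np.sort(rng.chisquare(n1, size=N))
x = np.sort(rng.chisquare(n, size=N))

x2 = x - x1          # NOT independent of X1

print(x2.mean())     # close to n - n1 = 5
print(x2.var())      # far below 2(n - n1) = 10
```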