Find the distribution of $\xi$


Assume $X_1,\ldots,X_n$ are independent random variables, where $X_i\sim N(0,\sigma_i^2),i=1,2,\cdots,n$. Define $$Z=\frac{\sum\limits_{i=1}^{n}\frac{X_i}{\sigma_i^2}}{\sum\limits_{i=1}^{n}\frac{1}{\sigma_i^2}},$$ and $$\xi=\sum\limits_{i=1}^{n}\frac{(X_i-Z)^2}{\sigma_i^2}.$$ Find the distribution of $\xi$.

I tried normalizing the random variables but obtained nothing useful, and I have no idea what the correct way to get the answer is.

Can you give me a hint? Thank you!


Best answer

I apologize in advance: I will present a solution that does not use Cochran's theorem (which I admittedly don't understand), but rather the way it is usually done in my local tradition. I provide some general facts first.

If $Y_1,\ldots,Y_n$ are independent standard normal random variables and $Q$ is an orthogonal $n\times n$ matrix, then the random vector $\mathbf V=Q\mathbf Y$ also consists of independent standard normal random variables.

Proofs of this fact can be found in other answers on this site.

Multiplying a vector by an orthogonal matrix does not change its Euclidean norm: if $\mathbf V=Q\mathbf Y$ with an orthogonal matrix $Q$, then $\sum_{i=1}^n V_i^2 = \sum_{i=1}^n Y_i^2$.

Proof (short enough to give directly): $$ \sum_{i=1}^n V_i^2 = \mathbf V^T \mathbf V = \mathbf Y^T \underbrace{Q^T Q}_{I_n} \mathbf Y = \mathbf Y^T\mathbf Y = \sum_{i=1}^n Y_i^2. $$
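As a quick numerical sanity check of this norm-preservation fact (a NumPy sketch; the random orthogonal matrix obtained from a QR decomposition is just one convenient example, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build a random orthogonal matrix Q via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
assert np.allclose(Q.T @ Q, np.eye(n))  # Q is orthogonal

y = rng.standard_normal(n)
v = Q @ y
# The Euclidean norm is preserved: sum V_i^2 == sum Y_i^2.
assert np.isclose(np.sum(v**2), np.sum(y**2))
```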

Let $Y_1,\ldots,Y_n$ be independent standard normal random variables and let $Q$ be an orthogonal $n\times n$ matrix. If $\mathbf V=Q\mathbf Y$ then $$ \sum_{i=1}^n Y_i^2 - V_1^2 \sim \chi^2_{n-1}. $$

Indeed, since the sums of squares coincide, replace $\sum_{i=1}^n Y_i^2$ by $\sum_{i=1}^n V_i^2$: $$ \sum_{i=1}^n Y_i^2 - V_1^2 = \sum_{i=1}^n V_i^2 - V_1^2 = \sum_{i=2}^n V_i^2 \sim \chi^2_{n-1}. $$ The last step follows from the fact that $V_2,\ldots,V_n$ are independent standard normal.
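This lemma can also be checked by simulation (a NumPy sketch; the particular orthogonal $Q$ and the moment tolerances are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 6, 20000

# Any orthogonal Q works; here, one from a QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

y = rng.standard_normal((reps, n))   # rows: standard normal vectors Y
v = y @ Q.T                          # V = Q Y, row by row
s = (y**2).sum(axis=1) - v[:, 0]**2  # sum Y_i^2 - V_1^2

# chi^2_{n-1} has mean n-1 and variance 2(n-1); compare with the simulation.
assert abs(s.mean() - (n - 1)) < 0.1
assert abs(s.var() - 2 * (n - 1)) < 0.5
```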

We can then prove that $\xi=\sum\limits_{i=1}^{n}\frac{(X_i-Z)^2}{\sigma_i^2}\sim \chi^2_{n-1}$.

Denote $b = \sum_{i=1}^n \frac{1}{\sigma_i^2}$.

$$ \xi=\sum_{i=1}^{n}\frac{(X_i-Z)^2}{\sigma_i^2} = \sum_{i=1}^{n}\frac{X_i^2}{\sigma_i^2} - 2Z\underbrace{\sum_{i=1}^{n}\frac{X_i}{\sigma_i^2}}_{Zb}+Z^2\underbrace{\sum_{i=1}^{n}\frac{1}{\sigma_i^2}}_b = \sum_{i=1}^{n}\left(\frac{X_i}{\sigma_i}\right)^2 -bZ^2. $$
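The algebraic identity above can be verified numerically for a single sample (a NumPy sketch; the variances are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
sigma2 = rng.uniform(0.5, 2.0, n)             # variances sigma_i^2 (arbitrary)
x = rng.standard_normal(n) * np.sqrt(sigma2)  # X_i ~ N(0, sigma_i^2)

b = np.sum(1 / sigma2)
z = np.sum(x / sigma2) / b                    # the weighted mean Z

xi_direct = np.sum((x - z) ** 2 / sigma2)     # definition of xi
xi_expanded = np.sum(x**2 / sigma2) - b * z**2  # the expanded form above
assert np.isclose(xi_direct, xi_expanded)
```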

First note that $Y_i=\frac{X_i}{\sigma_i}\sim \mathcal N(0,1)$ are independent standard normal. If we show that there exists an orthogonal matrix $Q$ such that $bZ^2=V_1^2$, where $\mathbf V=Q\mathbf Y$, we are done.

Look at $$ bZ^2= (\sqrt{b}Z)^2 = \left(\sum_{i=1}^n \frac{X_i}{\sqrt{b}\sigma_i^2}\right)^2 = \left(\sum_{i=1}^n \frac{1}{\sqrt{b}\sigma_i}Y_i\right)^2. $$

Consider a square matrix with the first row $\left(\frac{1}{\sqrt{b}\sigma_1},\ldots, \frac{1}{\sqrt{b}\sigma_n}\right)$. This vector has unit length: $$ \sum_{i=1}^n \left(\frac{1}{\sqrt{b}\sigma_i}\right)^2 = \frac{1}{b} \cdot \sum_{i=1}^n \frac{1}{\sigma_i^2} = \frac{1}{b} \cdot b = 1. $$ An orthogonal matrix is a square matrix whose rows are orthonormal unit vectors, and any unit vector can be extended by $n-1$ further orthonormal vectors to an orthonormal basis of $\mathbb R^n$. Therefore there exists an orthogonal matrix $Q$ with this first row. Note then that $\mathbf V=Q\mathbf Y$ has first coordinate exactly $$ V_1 = \sum_{i=1}^n \frac{1}{\sqrt{b}\sigma_i}Y_i $$ and then $$ \xi=\sum_{i=1}^{n}Y_i^2 -V_1^2 \sim \chi^2_{n-1}. $$
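The conclusion $\xi\sim\chi^2_{n-1}$ can be checked end to end by Monte Carlo (a NumPy sketch; the $\sigma_i$ values and tolerances are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 20000
sigma = rng.uniform(0.5, 2.0, n)  # arbitrary standard deviations sigma_i

x = rng.standard_normal((reps, n)) * sigma   # rows: samples of (X_1,...,X_n)
b = np.sum(1 / sigma**2)
z = (x / sigma**2).sum(axis=1) / b           # weighted mean Z, per sample
xi = ((x - z[:, None]) ** 2 / sigma**2).sum(axis=1)

# chi^2_{n-1} has mean n-1 and variance 2(n-1); compare with the simulation.
assert abs(xi.mean() - (n - 1)) < 0.1
assert abs(xi.var() - 2 * (n - 1)) < 0.5
```

Note that the simulated mean is close to $n-1=4$, not $n=5$: one degree of freedom is lost to the weighted mean $Z$.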

Another answer

$Z$ is just a weighted sample mean with weights $w_i=\frac1{\sigma_i^2}$, so $\xi$ can be seen as a weighted sample variance. You can find the distribution of $\xi$ using some standard change of variables.

As $X_1,\ldots,X_n$ are independent normal, the vector $X=(X_1,\ldots,X_n)^T$ has an $n$-variate normal $N_n(0,\Sigma)$ distribution with $\Sigma=\operatorname{diag}\left(\frac1{w_1},\ldots,\frac1{w_n}\right)$.

Observe that

\begin{align} \xi&=\sum_{i=1}^n w_i(X_i-Z)^2 \\&=\sum_{i=1}^n w_iX_i^2-\left(\sum_{i=1}^n w_i\right)Z^2 \\&=\sum_{i=1}^n U_i^2-\frac{\left(\sum\limits_{i=1}^n \sqrt{w_i} U_i\right)^2}{\sum_{i=1}^n w_i}\qquad,\ U_i=\sqrt{w_i}\, X_i \end{align}

So change variables $X\mapsto U=(U_1,\ldots,U_n)^T$ such that $U=AX$ where $$A=\operatorname{diag}(\sqrt{w_1},\ldots,\sqrt{w_n})$$

This gives the distribution of $U$ as $U\sim N_n(0,A\Sigma A^T)$, i.e., $U\sim N_n(0,I_n)$.
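The standardization $U=AX$ can be checked empirically (a NumPy sketch; the weights and the covariance tolerance are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 3, 50000
w = np.array([0.5, 1.0, 2.0])                    # example weights w_i = 1/sigma_i^2
x = rng.standard_normal((reps, n)) / np.sqrt(w)  # X_i ~ N(0, 1/w_i)

u = x * np.sqrt(w)  # U = A X with A = diag(sqrt(w_1), ..., sqrt(w_n))
cov = np.cov(u, rowvar=False)
# Empirical covariance of U should be close to the identity: U ~ N_n(0, I_n).
assert np.allclose(cov, np.eye(n), atol=0.05)
```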

Now if the $V_i$ are such that $V_1=\sum\limits_{i=1}^n \sqrt{\frac{w_i}{\sum_{i=1}^n w_i}}U_i$ and $\sum\limits_{i=1}^n U_i^2=\sum\limits_{i=1}^n V_i^2$, then

\begin{align} \xi&=\sum_{i=1}^n U_i^2-\frac{\left(\sum\limits_{i=1}^n \sqrt{w_i} U_i\right)^2}{\sum_{i=1}^n w_i} \\&=\sum_{i=1}^n V_i^2-\left(\sum_{i=1}^n \sqrt{\frac{w_i}{\sum_{i=1}^n w_i}}U_i\right)^2 \\&=\sum_{i=1}^n V_i^2-V_1^2=\sum_{i=2}^n V_i^2 \end{align}

Here an orthogonal transformation comes in handy.

Transform $U\mapsto V=(V_1,\ldots,V_n)^T$ such that $V=PU$ where $P$ is an orthogonal matrix with first row fixed as $$\left(\sqrt{\frac{w_1}{\sum_{i=1}^n w_i}},\sqrt{\frac{w_2}{\sum_{i=1}^n w_i}},\ldots,\sqrt{\frac{w_n}{\sum_{i=1}^n w_i}}\right)$$
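Such a $P$ can be constructed explicitly, e.g. by completing the prescribed first row to an orthonormal basis via a QR decomposition (a NumPy sketch; the weights and the QR-based completion are illustrative choices, not the only way):

```python
import numpy as np

rng = np.random.default_rng(4)
w = np.array([0.5, 1.0, 2.0, 4.0])  # example weights w_i
n = len(w)

first_row = np.sqrt(w / w.sum())            # the prescribed unit first row
assert np.isclose(first_row @ first_row, 1.0)

# Complete first_row to an orthonormal basis: QR of [first_row | random columns].
m = np.column_stack([first_row, rng.standard_normal((n, n - 1))])
q, _ = np.linalg.qr(m)
if q[0, 0] * first_row[0] < 0:  # QR fixes columns only up to sign; align the first
    q[:, 0] = -q[:, 0]

P = q.T  # rows of P are orthonormal, and the first row is exactly first_row
assert np.allclose(P[0], first_row)
assert np.allclose(P @ P.T, np.eye(n))
```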

Since the standard multivariate normal distribution is invariant under orthogonal transformations, the distribution is unchanged, i.e., $V\sim N_n(0,I_n)$.

Alternatively, you can write down all the densities explicitly and compute the Jacobians to reach the same conclusion.