Find the distribution of $X_1^2 + X_2^2$?


Let $X_1$ and $X_2$ be independent $N(0, \sigma^2)$ random variables (mean $0$, variance $\sigma^2$). What is the distribution of $X_1^2 + X_2^2$?

My approach is that $X_1\sim N(0, \sigma^2)$ and $X_2\sim N(0, \sigma^2)$.

Transforming $X_1$ and $X_2$ into standard normal, $X_1/\sigma\sim N(0, 1)$ and $X_2/\sigma\sim N(0, 1)$.

Then $X_1^2/\sigma^2$ and $X_2^2/\sigma^2$ have a chi-squared distribution with 1 degree of freedom.

Then I found the moment-generating functions for $X_1^2$ and $X_2^2$: $$m_{X_1^2}(t) = (1-2t)^{-1/2}$$ and $$m_{X_2^2}(t) = (1-2t)^{-1/2}.$$

So the moment-generating function for $X_1^2 + X_2^2$ is $$m_{X_1^2}(t)\, m_{X_2^2}(t) = (1-2t)^{-1}.$$

So $X_1^2 + X_2^2$ has a chi-squared distribution with 2 degrees of freedom. My question is: can I treat $X_1^2/\sigma^2 + X_2^2/\sigma^2$ as $X_1^2 + X_2^2$ like I did above?


Best answer:

Hint: Recall that if $X_1,X_2,\dots,X_n$ are independent and identically distributed as $\chi_{1}^{2}$, then $$\sum_{i=1}^{n} X_{i} \sim \chi_{n}^{2},$$ and that if $Z \sim N(0,1)$, then $$Z^2 \sim \chi_{1}^{2}.$$

Additional hint/spoiler: By the above, for independent $X,Y \sim \chi_{1}^{2}$, it follows that $X+Y \sim \chi_2^2$. It can be shown that $X+Y$ has the same distribution as $W \sim \text{Exp}(\frac{1}{2})$. You should verify this, and then you are basically finished. To do this, prove that $X \sim \chi_n^2$ has density $$f(x \mid n) = \frac{1}{2^{n/2}\Gamma(n/2)}x^{n/2-1}e^{-x/2} \quad\text{for $x>0$}.$$ Then, see that the density of $X \sim \chi_2^2$ is $$f(x \mid 2) = \frac{1}{2}e^{-\frac{1}{2}x} \quad\text{for $x>0$},$$ which is the density of an exponential distribution with parameter $\frac{1}{2}$.

Another relevant derivation: Suppose $X$ is a random variable and $Z = aX$ for some $a \in \mathbb{R} \setminus \{0\}$. Then the cumulative distribution function of $Z$ is given by $$F_Z(z) =\mathbb{P}(Z \leq z)=\mathbb{P}(aX \leq z)=\mathbb{P}\left(X\leq \frac{z}{a}\right)=F_X\left(\frac{z}{a}\right),$$ where $F_X$ is the cumulative distribution function of $X$. Now we derive the density function of $Z$, denoted $f_Z$, in the case $a>0$ (the only case that applies here, since $\sigma^2>0$): $$f_Z(z)=\frac{\text{d}}{\text{d}z}F_Z(z)=\frac{\text{d}}{\text{d}z}F_X\left(\frac{z}{a}\right)=\frac{1}{a}f_X\left(\frac{z}{a}\right),$$ where $f_X$ is the density function of $X$.

Final hint: The above hints are in order of usage.

The following is a solution based on the above hints, for future readers' benefit.
Notice that we can standardize $X_1$ and $X_2$, so that $$\frac{X_i - 0}{\sigma} = \frac{X_i}{\sigma} \sim N(0,1) \quad\text{for}\ i=1,2.$$ It follows that $$\left(\frac{X_i}{\sigma}\right)^2 \sim \chi_1^2 \quad\text{for}\ i=1,2,$$ so that $$\left(\frac{X_1}{\sigma}\right)^2 + \left(\frac{X_2}{\sigma}\right)^2=\frac{1}{\sigma^2}(X_1^2 + X_2^2) \sim \chi_2^2.$$

Also, it is not difficult to verify that a $\chi_2^2$ random variable is equal in distribution to an $\text{Exp}(\frac{1}{2})$ random variable, so $$\frac{1}{\sigma^2}(X_1^2 + X_2^2) \sim \text{Exp}\left(\frac{1}{2}\right).$$

Now, let $X = \frac{1}{\sigma^2}(X_1^2 + X_2^2)$ and $a=\sigma^2>0$, so that $Z=aX=X_1^2 + X_2^2$, and apply the last hint. We know that the density function of $X$ is $$f_X(x)=\frac{1}{2}e^{-\frac{1}{2}x},$$ and it follows that the density function of $Z$ is $$f_Z(z)= \frac{1}{\sigma^2}f_X\left(\frac{z}{\sigma^2}\right)=\frac{1}{2\sigma^2}e^{-\frac{1}{2\sigma^2}z}.$$ We recognize this as the density of an exponential distribution with parameter $\frac{1}{2\sigma^2}$. In other words, $$X_1^2 + X_2^2 \sim \text{Exp}\left(\frac{1}{2\sigma^2}\right).$$
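As a quick sanity check of this conclusion, here is a minimal Monte Carlo sketch; the value of `sigma` is an arbitrary choice, and any $\sigma > 0$ works:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5          # arbitrary choice for the check; any sigma > 0 works
n = 1_000_000

# Draw X1, X2 ~ N(0, sigma^2) independently and form the sum of squares.
x1 = rng.normal(0.0, sigma, n)
x2 = rng.normal(0.0, sigma, n)
s = x1**2 + x2**2

# Exp(1/(2 sigma^2)) has mean 2 sigma^2 and variance (2 sigma^2)^2.
print(s.mean())   # close to 2 * sigma**2 = 4.5
print(s.var())    # close to (2 * sigma**2)**2 = 20.25
```

With a million samples, both moments land well within Monte Carlo error of the exponential's mean and variance.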

Another answer:

Not really. Recall that the mgf is $$ m_X(t) = \mathbb{E}\left[e^{tX}\right], $$ and ask yourself: if you rescale $X$ by a constant $\sigma$, what happens to the result?
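To make the hint concrete: since $X_1^2 = \sigma^2 (X_1/\sigma)^2$, rescaling moves the constant into the MGF's argument, giving $\mathbb{E}[e^{tX_1^2}] = (1-2\sigma^2 t)^{-1/2}$ rather than $(1-2t)^{-1/2}$. A small Monte Carlo check of this (the values of `sigma` and `t` are arbitrary, subject to $t < 1/(2\sigma^2)$ so the MGF exists):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, t, n = 2.0, 0.05, 2_000_000   # need t < 1/(2 sigma^2) = 0.125

x = rng.normal(0.0, sigma, n)
mgf_est = np.exp(t * x**2).mean()    # Monte Carlo estimate of E[exp(t X^2)]

print(mgf_est)                        # close to (1 - 2*sigma**2*t)**-0.5
print((1 - 2 * t) ** -0.5)            # the sigma = 1 formula; does not match
```

The estimate agrees with $(1-2\sigma^2 t)^{-1/2} \approx 1.29$ here, and is far from $(1-2t)^{-1/2} \approx 1.05$, which is exactly why the OP's substitution is not valid.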

Another answer:

You can approach it like this:

1) Calculate the distributions of $X_1^2$ and $X_2^2$ individually; call their densities $f$ and $g$.

2) Calculate the distribution of the sum via the convolution integral of $f$ and $g$.

This can be one way of calculating what you are asking!

Note:

To calculate the distribution of $X_1^2$ (e.g.), you can use the CDF method. For example $F_{X_1^2}(x)=P(X_1^2 \le x)$. Now you express it in terms of $X_1$, and then differentiate to get the PDF.
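For completeness, the CDF step can be carried out explicitly. Writing $\Phi$ and $\varphi$ for the standard normal CDF and density, for $x > 0$:
$$F_{X_1^2}(x)=P(X_1^2 \le x)=P\left(-\sqrt{x}\le X_1\le\sqrt{x}\right)=2\,\Phi\!\left(\frac{\sqrt{x}}{\sigma}\right)-1,$$
and differentiating with respect to $x$ gives the PDF
$$f_{X_1^2}(x)=\frac{1}{\sigma\sqrt{x}}\,\varphi\!\left(\frac{\sqrt{x}}{\sigma}\right)=\frac{1}{\sigma\sqrt{2\pi x}}\,e^{-x/(2\sigma^2)},$$
which is the $\Gamma(\frac{1}{2}, 2\sigma^2)$ density, consistent with the other answers.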

Another answer:

$\newcommand{\+}{^{\dagger}} \newcommand{\angles}[1]{\left\langle\, #1 \,\right\rangle} \newcommand{\braces}[1]{\left\lbrace\, #1 \,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\, #1 \,\right\rbrack} \newcommand{\ceil}[1]{\,\left\lceil\, #1 \,\right\rceil\,} \newcommand{\dd}{{\rm d}} \newcommand{\down}{\downarrow} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,{\rm e}^{#1}\,} \newcommand{\fermi}{\,{\rm f}} \newcommand{\floor}[1]{\,\left\lfloor #1 \right\rfloor\,} \newcommand{\half}{{1 \over 2}} \newcommand{\ic}{{\rm i}} \newcommand{\iff}{\Longleftrightarrow} \newcommand{\imp}{\Longrightarrow} \newcommand{\isdiv}{\,\left.\right\vert\,} \newcommand{\ket}[1]{\left\vert #1\right\rangle} \newcommand{\ol}[1]{\overline{#1}} \newcommand{\pars}[1]{\left(\, #1 \,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\pp}{{\cal P}} \newcommand{\root}[2][]{\,\sqrt[#1]{\vphantom{\large A}\,#2\,}\,} \newcommand{\sech}{\,{\rm sech}} \newcommand{\sgn}{\,{\rm sgn}} \newcommand{\totald}[3][]{\frac{{\rm d}^{#1} #2}{{\rm d} #3^{#1}}} \newcommand{\ul}[1]{\underline{#1}} \newcommand{\verts}[1]{\left\vert\, #1 \,\right\vert} \newcommand{\wt}[1]{\widetilde{#1}}$ Let's $\ds{X \equiv X_{1}^{2} + X_{2}^{2}}$:

\begin{align} &\color{#00f}{\large\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} {\expo{-X_{1}^{2}/\pars{2\sigma^{2}}} \over \root{2\pi}\sigma} \,{\expo{-X_{2}^{2}/\pars{2\sigma^{2}}} \over \root{2\pi}\sigma} \delta\pars{X - X_{1}^{2} - X_{2}^{2}}\,\dd X_{1}\,\dd X_{2}} \\[3mm]&={1 \over 2\pi\sigma^{2}}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \expo{-\pars{X_{1}^{2} + X_{2}^{2}}/\pars{2\sigma^{2}}} \delta\pars{X - X_{1}^{2} - X_{2}^{2}}\,\dd X_{1}\,\dd X_{2} \\[3mm]&=\Theta\pars{X}\,{\expo{-X/\pars{2\sigma^{2}}} \over 2\pi\sigma^{2}}\times \\[3mm]&\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \!\!\!\!\!\Theta\pars{\root{X} - \verts{X_{1}}}\bracks{% {\delta\pars{X_{2} + \root{X - X_{1}^{2}}} \over 2\verts{X_{2}}} +{\delta\pars{X_{2} - \root{X - X_{1}^{2}}} \over 2\verts{X_{2}}}} \,\dd X_{1}\,\dd X_{2} \\[3mm]&=\Theta\pars{X}\,{\expo{-X/\pars{2\sigma^{2}}} \over 2\pi\sigma^{2}} \int_{-\root{X}}^{\root{X}}{\dd X_{1} \over \root{X - X_{1}^{2}}} =\Theta\pars{X}\, {\expo{-X/\pars{2\sigma^{2}}} \over 2\pi\sigma^{2}}\bracks{2\arcsin\pars{1}} \\[3mm]&=\color{#00f}{\large\Theta\pars{X}\, {\expo{-X/\pars{2\sigma^{2}}} \over 2\sigma^{2}}} \end{align}

$\ds{\Theta\pars{x}}$ is the Heaviside Step Function. $\ds{\delta\pars{x}}$ is the Dirac Delta Function.

Another answer:

You have $$ X_{i}^{2}=\sigma^{2}Z^{2}\sim\sigma^{2}\,\Gamma\left(\tfrac{1}{2},2\right)=\Gamma\left(\tfrac{1}{2},2\sigma^{2}\right), $$ where $Z \sim N(0,1)$.

Therefore we have $$ X_{1}^{2}+X_{2}^{2}\sim\Gamma(1,2\sigma^{2}), $$

where we used the additivity property of the $\Gamma$-distribution: independent $\Gamma$ random variables with a common scale parameter add their shape parameters. Note that $\Gamma(1, 2\sigma^2)$ is exactly the exponential distribution with mean $2\sigma^2$, in agreement with the other answers.
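The additivity step can also be checked numerically. A minimal sketch (the value of `sigma` is arbitrary; NumPy's gamma sampler uses the same shape-scale parameterization as above):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, n = 1.3, 1_000_000
theta = 2 * sigma**2                 # common scale parameter

# Additivity: Gamma(1/2, theta) + Gamma(1/2, theta) should match Gamma(1, theta).
g = rng.gamma(0.5, theta, n) + rng.gamma(0.5, theta, n)

# Direct construction: X1^2 + X2^2 with X1, X2 ~ N(0, sigma^2).
x = rng.normal(0, sigma, n)**2 + rng.normal(0, sigma, n)**2

# Gamma(1, theta) is Exp with mean theta and variance theta^2; all should agree.
print(g.mean(), x.mean(), theta)
print(g.var(), x.var(), theta**2)
```

Both samples reproduce the mean $2\sigma^2$ and variance $(2\sigma^2)^2$ of the claimed $\Gamma(1, 2\sigma^2)$ distribution up to Monte Carlo error.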