Reversing results for sums of independent variables


Please let me use a specific example to illustrate the general title above.

(1) It is well known that if $X$ and $Y$ are independent with $X,Y\sim N(0,1)$, then $$ Z\equiv X^2+Y^2\sim\chi_2^2, $$ where $\chi_n^2$ denotes the $\chi^2$ distribution with $n\in\mathbb{R}_{++}$ degrees of freedom, which can be defined via its PDF (or, equivalently, CDF, MGF, etc.) without any mention of the normal distribution.
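For concreteness, here is a quick Monte Carlo sanity check of (1); just a sketch, using `numpy` and `scipy` as one convenient choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

# Independent X, Y ~ N(0,1); form Z = X^2 + Y^2.
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = x**2 + y**2

# Kolmogorov-Smirnov test of the sample against the chi^2_2 CDF.
print(stats.kstest(z, 'chi2', args=(2,)).pvalue)  # large p-value: consistent with Z ~ chi^2_2
```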

I'd like to say something like:

(2) If $Z\sim\chi_2^2$, then $\exists X,Y$ independent, $X,Y\sim N(0,1)$ such that $Z=X^2+Y^2$.

Are there some general techniques that lead from (1) to (2)? If not, are there specific techniques applicable to this particular case?

Thank you.

Best answer:

There are two answers, depending on how restrictive your setting is. Since the OP asks for a general principle, I will consider the following simpler example:

Let $\Omega =\{0,1,2\}$, $P = (\delta_0+2\delta_1+\delta_2)/4$, and $Z:\Omega \to \mathbb{R}$ with $Z(\omega ) = \omega$. We know that $Z \stackrel{d}{=} Z'$ with $Z' = X'+Y'$, where $X'$ and $Y'$ are independent and uniform on $\{0,1\}$ ($\stackrel{d}{=}$ stands for equality in distribution).
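The claim $Z \stackrel{d}{=} X'+Y'$ can be verified by direct enumeration; a minimal sketch in Python:

```python
from itertools import product
from collections import Counter

# X', Y' independent and uniform on {0,1}: each of the four pairs has probability 1/4.
law = Counter()
for xp, yp in product([0, 1], repeat=2):
    law[xp + yp] += 1 / 4

print(dict(law))  # {0: 0.25, 1: 0.5, 2: 0.25}, i.e. (delta_0 + 2*delta_1 + delta_2)/4
```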

Restrictive setting: Do there exist $X$ and $Y$ on the same probability space as $Z$ such that $Z= X+Y$ almost surely, with $X$ and $Y$ uniform on $\{0,1\}$ and independent of each other?

Answer: no. Quick: one cannot construct $X$ and $Y$ on this probability space with the correct marginals and independence.

More elaborate: Suppose such $X$ and $Y$ exist. Then $X: \Omega \to \mathbb{R}$ is measurable, so $X^{-1}(\{0\})=A$ and $X^{-1}(\{1\})=\Omega \setminus A$ for a certain $A \subset \Omega$. Moreover, to have $P \circ X^{-1}(\{0\})= 1/2$ we need $A= A_1=\{0,2\}$ or $A=A_2=\{1\}$. We are then forced to define $Y= Z-X$. In the case $A=A_1$ this leads to $P(Y=2) = P(Z=2,X=0) = P(Z=2) = 1/4$, so $Y$ does not have the uniform distribution on $\{0,1\}$; furthermore, $X$ and $Y$ are not independent. If $A=A_2$, then $Y(0)=Z(0)-X(0)=-1$, so $P(Y=-1)=1/4$, with the same problems. To sum up, one cannot construct such $X$ and $Y$ on the same probability space as $Z$.
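Since $\Omega$ has only three points, this case analysis can also be exhausted mechanically: there are just $2^3$ candidate maps $X:\Omega\to\{0,1\}$, and none of them yields an independent uniform pair $(X, Z-X)$. A sketch in Python:

```python
from itertools import product

P = {0: 1/4, 1: 1/2, 2: 1/4}               # P = (delta_0 + 2*delta_1 + delta_2)/4

def is_uniform01(V):
    # V: map omega -> value; checks P(V=0) = P(V=1) = 1/2
    mass = lambda v: sum(p for w, p in P.items() if V[w] == v)
    return abs(mass(0) - 1/2) < 1e-12 and abs(mass(1) - 1/2) < 1e-12

for vals in product([0, 1], repeat=3):      # all maps X: {0,1,2} -> {0,1}
    X = dict(zip(P, vals))
    Y = {w: w - X[w] for w in P}            # forced: Y = Z - X, with Z(omega) = omega
    indep = all(
        abs(sum(p for w, p in P.items() if X[w] == a and Y[w] == b) - 1/4) < 1e-12
        for a in (0, 1) for b in (0, 1)
    )
    if is_uniform01(X) and is_uniform01(Y) and indep:
        print("admissible X found:", X)
        break
else:
    print("no admissible X exists")         # this is what gets printed
```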

Less restrictive setting: Do there exist $X$ and $Y$ on an extension of the probability space on which $Z$ is defined, which are independent, uniform on $\{0,1\}$, and such that $Z= X+Y$ almost surely?

Answer: yes. Quick: define $X$ via a regular conditional probability and set $Y:=Z-X$.

More elaborate: Let $\kappa(z,dx) = P(X' \in dx \mid X'+Y'=z)$ be a regular conditional probability for two independent, uniform $\{0,1\}$ random variables $X'$ and $Y'$. Use Lemma 3.22 in Kallenberg, Foundations of Modern Probability (cited here because it covers the general case) to construct $X$ on a probability space $(\Omega \times [0,1],\mathcal{F}_1,P_1)$ with $P_1 (Z\in dz, X \in dx) = P(Z \in dz)\, \kappa(z,dx)$. Define $Y := Z-X$ and check that $X$ and $Y$ are independent and uniform on $\{0,1\}$. In our particular case, we can construct Kallenberg's function $f: S=\{0,1,2\}\times [0,1] \to T = \{0,1\}$ explicitly ($S$ is the set where $Z$ takes values and $T$ is the set where $X$ takes values): $f(0,\vartheta)=0$, $f(1,\vartheta)= \mathbb{1}_{\{\vartheta \geq 1/2\}}$, and $f(2,\vartheta)=1$, where $\vartheta$ is uniform on $[0,1]$ and independent of $Z$. Then the pair $(Z,f(Z,\vartheta))$ has law $P(Z \in dz)\,\kappa(z,dx)$ by the aforementioned lemma.
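For illustration, here is a small simulation of this construction (a sketch only; $\vartheta$ plays the role of the auxiliary uniform coordinate on the extension):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Z with law (delta_0 + 2*delta_1 + delta_2)/4 on the original space,
# theta ~ U[0,1] on the extension factor, independent of Z.
z = rng.choice([0, 1, 2], size=n, p=[0.25, 0.5, 0.25])
theta = rng.uniform(size=n)

# Representation X = f(Z, theta) with f(0,.) = 0, f(1,theta) = 1_{theta >= 1/2}, f(2,.) = 1.
x = np.where(z == 0, 0, np.where(z == 2, 1, (theta >= 0.5).astype(int)))
y = z - x                                   # Z = X + Y holds exactly, not just in law

# Empirically, X and Y are (approximately) uniform on {0,1} and independent:
print(np.mean(x), np.mean(y))               # both close to 1/2
print(np.mean((x == 1) & (y == 1)))         # close to 1/4 = P(X=1)P(Y=1)
```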