Independent random variables: Sum


Suppose that for $n\in\mathbb{N}$, $(Y_1,\ldots ,Y_{n+1})$ is a finite collection of independent random variables. Does this imply that $Y_{n+1}$ is independent of $Y_{1} +\cdots+Y_{n}$? And if so, how does one prove this rigorously? Thanks in advance!


Assume for concreteness that the $Y_i$ take values in $\mathbb N$ (the same argument works for any countable value set). Take $S_k=\{(k_1,\ldots, k_n)\in\mathbb N^n:k_1+\ldots+k_n=k\}$.

Then $$\mathbb P(Y_1+\ldots+Y_n =k, Y_{n+1}=l)=\sum_{(k_1,\ldots, k_n)\in S_k}\mathbb P(Y_1=k_1,\ldots, Y_n=k_n, Y_{n+1}=l)$$ $$ = \sum_{(k_1,\ldots, k_n)\in S_k}\mathbb P(Y_1=k_1,\ldots, Y_n=k_n) \mathbb P(Y_{n+1}=l)=\mathbb P(Y_1+\ldots+Y_n =k)\mathbb P( Y_{n+1}=l)$$

and hence they're independent.
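As a quick numerical sanity check of the identity above (not a substitute for the proof), the sketch below simulates $n = 3$ independent Bernoulli(1/2) variables together with a fourth one and compares the empirical joint probability $\mathbb P(Y_1+Y_2+Y_3=k,\ Y_4=l)$ with the product of the marginals. All concrete choices here ($n$, the Bernoulli distribution, the test point $(k, l)$) are illustrative assumptions.

```python
import random

random.seed(0)
N = 200_000          # number of simulated samples
n = 3                # S = Y_1 + Y_2 + Y_3; check independence against Y_4
k, l = 2, 1          # test the event {S = k, Y_4 = l}

joint = count_s = count_y = 0
for _ in range(N):
    # n + 1 independent Bernoulli(1/2) draws
    ys = [random.randint(0, 1) for _ in range(n + 1)]
    s = sum(ys[:n])
    joint += (s == k and ys[n] == l)
    count_s += (s == k)
    count_y += (ys[n] == l)

p_joint = joint / N                       # estimates P(S = k, Y_4 = l)
p_prod = (count_s / N) * (count_y / N)    # estimates P(S = k) P(Y_4 = l)
print(p_joint, p_prod)
```

With these choices the true values are $\mathbb P(S=2)\,\mathbb P(Y_4=1) = \tfrac38\cdot\tfrac12 = \tfrac{3}{16}$, so the two printed estimates should agree up to Monte Carlo noise.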


To prove this rigorously, you need some measure-theoretic probability. You may proceed as follows:

Let $Y_1, \ldots, Y_{n + 1}$ be random variables defined on the probability space $(\Omega, \mathscr{F}, P)$. By definition,

$Y_1, \ldots, Y_{n + 1}$ are said to be independent if their corresponding $\sigma$-fields $\sigma(Y_1), \ldots, \sigma(Y_{n + 1})$ are independent.

Some explanations of the terminology:

  • For any random variable $X$ defined on $(\Omega, \mathscr{F}, P)$, $\sigma(X)$ is the smallest $\sigma$-field with respect to which $X$ is measurable.

  • $k$ classes of measurable sets $\mathscr{F}_i$, $i = 1, \ldots, k$, are said to be independent if, for any $A_i \in \mathscr{F}_i$, $i = 1, \ldots, k$, it holds that $$P(A_1 \cap \cdots \cap A_k) = P(A_1) \cdots P(A_k).$$

Therefore, by this definition, to show that $Y_{n + 1}$ and $Y_1 + \cdots + Y_n$ are independent, what you need to show is that $\sigma(Y_{n + 1})$ and $\sigma(Y_1 + \cdots + Y_n)$ are independent. For brevity, denote $Y_1 + \cdots + Y_n$ by $S_n$, and let $\mathscr{G} = \sigma(Y_1, \ldots, Y_n)$ be the $\sigma$-field generated by $Y_1, \ldots, Y_n$, i.e., the smallest $\sigma$-field containing $\bigcup_{i = 1}^n \sigma(Y_i)$. (Note that the union itself is in general not a $\sigma$-field, so it cannot play this role directly.) We shall show that:

  • $\sigma(S_n) \subset \mathscr{G}$.

  • $\mathscr{G}$ and $\sigma(Y_{n + 1})$ are independent.

To show the first claim: each $Y_i$ is $\sigma(Y_i)$-measurable and $\sigma(Y_i) \subset \mathscr{G}$, so $Y_i$ is $\mathscr{G}$-measurable. Since a sum of measurable functions is again measurable, $S_n$ is $\mathscr{G}$-measurable, and by the minimality of $\sigma(S_n)$ we conclude that $\sigma(S_n) \subset \mathscr{G}$.

The second claim requires more care, because a general $A \in \mathscr{G}$ need not lie in any single $\sigma(Y_i)$. Let $$\mathscr{P} = \{A_1 \cap \cdots \cap A_n : A_i \in \sigma(Y_i),\ i = 1, \ldots, n\}.$$ This collection is a $\pi$-system (it is closed under finite intersections) and it generates $\mathscr{G}$. For any $A_1 \cap \cdots \cap A_n \in \mathscr{P}$ and $B \in \sigma(Y_{n + 1})$, the independence of $Y_1, \ldots, Y_{n + 1}$ gives $$P(A_1 \cap \cdots \cap A_n \cap B) = P(A_1) \cdots P(A_n) P(B) = P(A_1 \cap \cdots \cap A_n) P(B).$$ By Dynkin's $\pi$-$\lambda$ theorem, this product formula extends from the generating $\pi$-system $\mathscr{P}$ to all of $\mathscr{G}$, so $\mathscr{G}$ and $\sigma(Y_{n + 1})$ are independent.

Combining the two claims: every $A \in \sigma(S_n)$ belongs to $\mathscr{G}$ and is therefore independent of every $B \in \sigma(Y_{n + 1})$, so $S_n$ and $Y_{n + 1}$ are independent.
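The conclusion holds for arbitrary (not just discrete) random variables, and one can illustrate it numerically through the factorisation of the joint distribution function, $\mathbb P(S_n \le a,\ Y_{n+1} \le b) = \mathbb P(S_n \le a)\,\mathbb P(Y_{n+1} \le b)$. The sketch below checks this at a single test point for standard normal variables; the choices of $n$, the distribution, and the point $(a, b)$ are illustrative assumptions.

```python
import random

random.seed(1)
N = 200_000          # number of simulated samples
n = 3                # S = Y_1 + Y_2 + Y_3; check independence against Y_4
a, b = 0.5, 0.0      # test point for the joint-CDF factorisation

joint = count_s = count_y = 0
for _ in range(N):
    # n + 1 independent standard normal draws
    ys = [random.gauss(0.0, 1.0) for _ in range(n + 1)]
    s = sum(ys[:n])
    joint += (s <= a and ys[n] <= b)
    count_s += (s <= a)
    count_y += (ys[n] <= b)

p_joint = joint / N                       # estimates P(S <= a, Y_4 <= b)
p_prod = (count_s / N) * (count_y / N)    # estimates P(S <= a) P(Y_4 <= b)
print(p_joint, p_prod)
```

Up to Monte Carlo noise the two estimates should coincide, consistent with $S_n$ and $Y_{n+1}$ being independent.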