Random Variable Worded Problem



I can work out the basics of the question, namely the mean and variance of Y:

  • E(Y) = 1-2p
  • Var(Y) = 4p(1-p)
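These two formulas match a variable taking the values $-1$ and $+1$. Here is a minimal sketch checking them numerically; the assumption that $Y = -1$ with probability $p$ and $Y = +1$ with probability $1-p$ is read off from the formulas, not stated in the excerpt:

```python
# Check E(Y) = 1 - 2p and Var(Y) = 4p(1 - p), assuming (as the
# formulas suggest) Y = -1 with probability p, Y = +1 otherwise.
def mean_var(p):
    values = [(-1, p), (1, 1 - p)]
    e = sum(v * q for v, q in values)        # E(Y)
    e2 = sum(v * v * q for v, q in values)   # E(Y^2)
    return e, e2 - e * e                     # (E(Y), Var(Y))

for p in (0.1, 0.25, 0.5, 0.9):
    e, var = mean_var(p)
    assert abs(e - (1 - 2 * p)) < 1e-12
    assert abs(var - 4 * p * (1 - p)) < 1e-12
```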

I don't understand parts (i) and (ii), or even the question itself: what is $X = Y_1 + Y_2 + Y_3$, and how is it used in the rest of the question?

  • Why is $\mathrm{E}(X) = n\mathrm{E}(Y)$ and $\mathrm{Var}(X) = n\,\mathrm{Var}(Y)$?

Can someone please explain the logic and the working for parts (i) and (ii)? How does the textbook reach this conclusion?


2 Answers

BEST ANSWER

The first relationship follows from the linearity of expectation: for any random variables $Y_1, Y_2, \ldots, Y_n$, $$\mathrm{E}[Y_1 + Y_2 + \cdots + Y_n] = \mathrm{E}[Y_1] + \mathrm{E}[Y_2] + \cdots + \mathrm{E}[Y_n].$$ They need not be independent or identically distributed.
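To illustrate that independence is not needed, here is a sketch with a small, arbitrarily chosen joint distribution in which $Y_1$ and $Y_2$ are dependent, yet linearity of expectation still holds:

```python
# An arbitrary joint distribution over (y1, y2); the variables are
# dependent: P(Y1=-1)=0.3, P(Y2=-1)=0.6, but P(Y1=-1, Y2=-1)=0.2 != 0.18.
joint = {(-1, -1): 0.2, (-1, 1): 0.1, (1, -1): 0.4, (1, 1): 0.3}

e_y1 = sum(y1 * p for (y1, y2), p in joint.items())
e_y2 = sum(y2 * p for (y1, y2), p in joint.items())
e_sum = sum((y1 + y2) * p for (y1, y2), p in joint.items())

# Linearity holds despite the dependence.
assert abs(e_sum - (e_y1 + e_y2)) < 1e-12
```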

The second relationship requires the variables in the sum to be pairwise uncorrelated, which holds in particular when they are mutually independent: for each $i \ne j$, $\mathrm{E}[Y_i Y_j] = \mathrm{E}[Y_i]\mathrm{E}[Y_j].$

Start from the standard identity $$\mathrm{Var}[X] = \mathrm{E}\left[(X - \mathrm{E}[X])^2\right] = \mathrm{E}[X^2] - \mathrm{E}[X]^2.$$ By the linearity of expectation, $$\mathrm{E}[X^2] = \mathrm{E}\left[\sum_{i=1}^n Y_i^2 + \sum_{i \ne j} Y_i Y_j\right] = \sum_{i=1}^n \mathrm{E}[Y_i^2] + \sum_{i \ne j} \mathrm{E}[Y_i Y_j].$$ We also have $$\mathrm{E}[X]^2 = \biggl( \sum_{i=1}^n \mathrm{E}[Y_i] \biggr)^2 = \sum_{i=1}^n \mathrm{E}[Y_i]^2 + \sum_{i \ne j} \mathrm{E}[Y_i]\mathrm{E}[Y_j].$$ But if $Y_1, Y_2, \ldots, Y_n$ are independent, then the second sums in the two expressions are equal, so they cancel in the difference: $$\mathrm{E}[X^2] - \mathrm{E}[X]^2 \underset{\mathrm{ind}}{=} \sum_{i=1}^n \left(\mathrm{E}[Y_i^2] - \mathrm{E}[Y_i]^2\right) = \sum_{i=1}^n \mathrm{Var}[Y_i].$$ If $Y_1, Y_2, \ldots, Y_n$ are also identically distributed, then they share a common variance, so $\mathrm{Var}[X] = n \,\mathrm{Var}[Y_1]$ as claimed.
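The identity $\mathrm{Var}[X] = n\,\mathrm{Var}[Y]$ can also be checked by exact enumeration. A sketch, again assuming the $\pm 1$ distribution with $P(Y=-1)=p$ suggested by the question's formulas:

```python
from itertools import product

# Exact check that Var(Y1 + ... + Yn) = n * Var(Y) for n independent,
# identically distributed copies of Y, with Y in {-1, +1} and
# P(Y = -1) = p (an assumed distribution, read off the question).
def var_of_sum(p, n):
    dist = [(-1, p), (1, 1 - p)]
    e = e2 = 0.0
    for outcome in product(dist, repeat=n):  # all 2^n joint outcomes
        s = sum(v for v, _ in outcome)       # value of X = sum of the Yi
        q = 1.0
        for _, prob in outcome:              # independence: probabilities multiply
            q *= prob
        e += s * q
        e2 += s * s * q
    return e2 - e * e                        # Var(X) = E[X^2] - E[X]^2

p, n = 0.3, 3
assert abs(var_of_sum(p, n) - n * 4 * p * (1 - p)) < 1e-12
```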


Just wanted to show another way of looking at part (ii) (even though heropup's answer is correct). To find $Var(X)=Var(\sum_{i=1}^{n}Y_{i})$, note the following properties of random variables (proving each of them is left to you):

(1) $Var(W)=Cov(W,W)$

(2) $Cov(W+V,N+M)=Cov(W,N)+Cov(W,M)+Cov(V,N)+Cov(V,M)$ (covariance distributes over sums, like a distributive law)

(3) $Cov(W,V)=Cov(V,W)$

(4) if W and V are independent then $Cov(W,V)=0$

With this we see that $$Var(X)=Var\left(\sum_{i=1}^{n}Y_{i}\right)=Cov\left(\sum_{i=1}^{n}Y_{i},\sum_{i=1}^{n}Y_{i}\right)=\sum_{i=1}^{n}Cov\left(Y_{i},Y_{i}\right)+2\sum_{i<j}Cov(Y_{i},Y_{j}),$$ which is best understood via the covariance matrix (http://en.wikipedia.org/wiki/Covariance_matrix): its diagonal entries are the variances $Cov(Y_{i},Y_{i})$, and by (3) the matrix is symmetric, so each off-diagonal entry appears twice, hence the factor of $2$ on the sum over $i<j$. Now (4) implies that $Cov(Y_{i},Y_{j})=0$ for $j\neq i$, since we supposed the variables were independent, leaving $$Var(X)=\sum_{i=1}^{n}Cov\left(Y_{i},Y_{i}\right)=\sum_{i=1}^{n}Var(Y_{i}).$$ If the $Y_{i}$'s are also identically distributed, the variance is the same for all of them, so letting $Var(Y)=\sigma^{2}$ we get $$Var(X)=n\sigma^{2}.$$
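The covariance-matrix picture can be made concrete by building the matrix explicitly. A sketch for $n = 3$ independent copies of the assumed $\pm 1$ variable, showing the off-diagonal entries vanish and the diagonal sums to $n\sigma^2$:

```python
from itertools import product

# Covariance matrix of (Y1, Y2, Y3) for independent copies of Y,
# with Y in {-1, +1} and P(Y = -1) = p (an assumed distribution).
p, n = 0.3, 3
dist = [(-1, p), (1, 1 - p)]

# Enumerate the joint distribution under independence.
outcomes = []
for combo in product(dist, repeat=n):
    values = tuple(v for v, _ in combo)
    prob = 1.0
    for _, q in combo:
        prob *= q                     # independence: probabilities multiply
    outcomes.append((values, prob))

def expect(f):
    return sum(f(vals) * q for vals, q in outcomes)

means = [expect(lambda v, i=i: v[i]) for i in range(n)]
cov = [[expect(lambda v, i=i, j=j: (v[i] - means[i]) * (v[j] - means[j]))
        for j in range(n)] for i in range(n)]

# Var(X) is the sum of ALL entries of the covariance matrix; the
# off-diagonal entries are zero by independence, so it reduces to
# the diagonal sum, n * Var(Y).
var_x = sum(cov[i][j] for i in range(n) for j in range(n))
assert all(abs(cov[i][j]) < 1e-12 for i in range(n) for j in range(n) if i != j)
assert abs(var_x - n * 4 * p * (1 - p)) < 1e-12
```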