$E[X_1+X_2+\cdots+X_n]=E[X_1]+E[X_2]+\cdots+E[X_n]$ Proof


I am trying to prove on my own (my book covers the case of continuous random variables, but I want to find the proof for discrete random variables) that:

$$E[X_1+X_2+\cdots+X_n]=E[X_1]+E[X_2]+\cdots+E[X_n]$$

I came up with something, but it seems too simple. I consider only two random variables, $X_1$ and $X_2$, for the proof, and I assume they have the same probability distribution (and that all outcomes are equally likely). Under those assumptions the definition of the expected value becomes (where $N$ is the sample size):

$$E[X] = \frac{\sum_{i=1}^{N} X_i}{N}$$

Now going back to the proof:

$$\begin{align}E[X_1 + X_2]&={\sum_{i=1}^{N} (X_{1i} + X_{2i}) \over N}\\[12pt] &= {\sum_{i=1}^{N} X_{1i} \over N }+{\sum_{i=1}^{N} X_{2i} \over N} \\[12pt] &=E[X_1] + E[X_2]\end{align}$$ This seems too simple. Would it also mean the result holds only if $X_1$ and $X_2$ have the same probability distribution?

Thank you.

1 Answer

$$ E[X] = \sum_x x \Pr(X=x). $$ This works if the random variable $X$ has a discrete distribution. For other distributions, one needs a more general formula.
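In fact, identical distributions are not needed: linearity of expectation holds for any two discrete random variables with a joint pmf $p(x,y) = \Pr(X=x,\, Y=y)$, whether or not they are independent. A sketch using the general formula above:

$$\begin{align}E[X+Y] &= \sum_x \sum_y (x+y)\, p(x,y)\\[6pt] &= \sum_x x \sum_y p(x,y) + \sum_y y \sum_x p(x,y)\\[6pt] &= \sum_x x \Pr(X=x) + \sum_y y \Pr(Y=y)\\[6pt] &= E[X] + E[Y].\end{align}$$

The key step is that summing the joint pmf over one variable recovers the marginal pmf of the other; no assumption about the two distributions being equal (or uniform) is used.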

If there are just finitely many possible values and they all have the same probability (so it's a discrete uniform distribution), then you can say $$ E[X] = \sum_{i=1}^N x_i \frac1N $$ where $N$ is the number of possible values, and this is then the same as $$ \frac{\sum_{i=1}^N x_i}{N}. $$ This works only for discrete uniform distributions.
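As a quick sanity check, here is a short script that computes $E[X_1 + X_2]$ directly from a joint pmf and compares it with $E[X_1] + E[X_2]$, using two deliberately different, non-uniform distributions (the particular pmfs `px` and `py` are made-up examples, taken independent here for simplicity):

```python
import itertools

# Two discrete random variables with *different*, non-uniform distributions.
# These pmfs are illustrative choices, not taken from the post.
px = {1: 0.2, 2: 0.3, 3: 0.5}   # pmf of X1
py = {0: 0.6, 10: 0.4}          # pmf of X2 (different support and shape)

def expectation(pmf):
    """E[X] = sum over x of x * P(X = x)."""
    return sum(x * p for x, p in pmf.items())

# E[X1 + X2] computed directly from the joint pmf
# P(X1 = x, X2 = y) = px[x] * py[y]  (independence assumed for the example).
e_sum = sum((x + y) * px[x] * py[y]
            for x, y in itertools.product(px, py))

print(e_sum)                               # direct computation
print(expectation(px) + expectation(py))   # linearity: E[X1] + E[X2]
```

Both numbers agree (up to floating-point rounding), even though neither distribution is uniform and the two are not identical, matching the general argument above.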

[to be continued]