Linearity of expectation is the property that the expected value of the sum of random variables is equal to the sum of their individual expected values, regardless of whether they are independent.
My understanding of random variables (both continuous and discrete) is that they assign a number to each possible outcome of a random experiment.
For example, if we roll a die, the outcome is a number between 1 and 6, and we can define a random variable $X$ that takes the value shown on the die. Here, $X$ assigns a number to each possible outcome.
The expected value of $X$ would then be
$\text{E}[X] = 1\cdot\tfrac{1}{6} + 2\cdot\tfrac{1}{6} + \dots + 6\cdot\tfrac{1}{6} = 3.5,$
the weighted sum of the possible outcomes. Everything is fine up to this point. Here is what I don't understand:
What is this notion of adding two random variables? They don't have definite values, so how can we say
$\text{E}[X + X] = \text{E}[X] + \text{E}[X] = 7$
This is just linearity of expectation applied to rolling two dice, where we are asked for the expected sum of the numbers on both dice. But how exactly are we adding two random variables?
What if we wanted the product of the numbers on two dice? Or the difference, or the quotient?
$\text{E}[X * X] = \text{E}[X] * \text{E}[X] = 12.25$
$\text{E}[X - X] = \text{E}[X] - \text{E}[X] = 0$
$\text{E}[X / X] = \text{E}[X] / \text{E}[X] = 1$
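To make the question concrete, here is a small sanity check (a sketch in Python; the function and variable names are mine) that enumerates die outcomes exactly. It computes each expectation two ways: applying the operation to a *single* roll (so $X + X$ reuses the same outcome) and to *two independent* rolls:

```python
from fractions import Fraction

faces = range(1, 7)

# E[f(X)] for one fair die: f sees a SINGLE outcome,
# so "X + X" here means twice the same roll.
def E(f):
    return sum(Fraction(f(x)) for x in faces) / 6

# E[f(X, Y)] for two independent fair dice.
def E2(f):
    return sum(Fraction(f(x, y)) for x in faces for y in faces) / 36

print(E(lambda x: x))          # E[X]     = 7/2
print(E(lambda x: x + x))      # E[X + X] = 7
print(E(lambda x: x * x))      # E[X * X] = E[X^2] = 91/6
print(E2(lambda x, y: x + y))  # E[X + Y] = 7  (two independent dice)
print(E2(lambda x, y: x * y))  # E[X * Y] = 49/4 = 12.25
```

Note the two versions already disagree for the product: the same-roll product gives $91/6 \approx 15.17$, while two independent dice give $12.25$.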
Are all of these operations valid? I'm really confused. How am I supposed to think about these random variables?
This is in connection to my other question that I will hopefully be able to make sense of.
No, $E(X^2)$ is not equal to $E(X)^2$ in general. For a fair die, $E(X^2)=\frac{1+4+\dots+36}{6}=\frac{91}{6}\approx 15.17$, while $E(X)^2=3.5^2=12.25$.
Yes, $E(X-X)=E(X)-E(X)=0$. Linearity of expectation implies $E(X-Y)=E(X)-E(Y)$.
Sort of. I would say $E(X/X)=E(1)=1=E(X)/E(X)$, as long as this makes sense ($P(X=0)=0$ and $E(X)\ne0$), but this is really a coincidence, since $E(X/Y)\ne E(X)/E(Y)$ in general.
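As a check on that last point, a short Python enumeration (the variable names are mine) confirms that $E(X/X)=1$ exactly, while for two independent fair dice $E(X/Y)=\frac{343}{240}\approx 1.43$, which is not $E(X)/E(Y)=1$:

```python
from fractions import Fraction

faces = range(1, 7)

# Same roll in numerator and denominator: X / X is identically 1.
exx = sum(Fraction(x, x) for x in faces) / 6

# Two independent rolls: E[X / Y] = E[X] * E[1 / Y] by independence.
exy = sum(Fraction(x, y) for x in faces for y in faces) / 36

print(exx)  # 1
print(exy)  # 343/240, not E[X]/E[Y] = 1
```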