I want to clear up some confusion I have about the expected value.
The definition of the expected value for the finite case is given as follows:
Let $X$ be a random variable with a finite number of finite outcomes $x_1, x_2, \ldots, x_k$ occurring with probabilities $p_1, p_2, \ldots, p_k,$ respectively. The expectation of $X$ is defined as
$$\operatorname{E}[X] =\sum_{i=1}^k x_i\,p_i=x_1p_1 + x_2p_2 + \cdots + x_kp_k.$$
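As a quick numerical illustration of the finite-case definition (my own example, not part of the quoted definition), here is a short Python sketch computing $\operatorname{E}[X]$ for a fair six-sided die:

```python
# Finite case: E[X] = sum over outcomes of x_i * p_i.
# Example (hypothetical): a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6  # each outcome equally likely

expectation = sum(x * p for x, p in zip(outcomes, probs))
print(expectation)  # 3.5
```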
The definition of the expected value for the countably infinite case is given as follows:
Let $X$ be a non-negative random variable with a countable set of outcomes $x_1, x_2, \ldots,$ occurring with probabilities $p_1, p_2, \ldots,$ respectively. Analogous to the finite case, the expected value of $X$ is then defined as the series
$$\operatorname{E}[X] = \sum_{i=1}^\infty x_i\,p_i.$$
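To see the countably infinite definition in action, here is a small sketch (my own example) with $x_i = i$ and $p_i = (1/2)^i$, where the series converges to $2$; a truncated partial sum approximates it:

```python
# Countably infinite case (hypothetical example):
# X takes the value i with probability (1/2)^i, for i = 1, 2, ...
# The series sum_i i * (1/2)^i converges to 2.
partial = sum(i * 0.5 ** i for i in range(1, 200))  # truncate at 199 terms
print(partial)  # very close to 2.0
```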
The definition of the expected value for the absolutely continuous case is given as follows:
If $X$ is a random variable whose cumulative distribution function admits a probability density function $f(x)$, then the expected value is defined as the following Lebesgue integral, if the integral exists:
$$\operatorname{E}[X] = \int_{\mathbb{R}} x f(x)\, dx.$$
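For the absolutely continuous case, a crude numerical check (my own example): take $X$ uniform on $[0,1]$, so $f(x)=1$ there and $\operatorname{E}[X]=1/2$, and approximate the integral with a midpoint Riemann sum:

```python
# Continuous case (hypothetical example): X uniform on [0, 1], f(x) = 1 there.
# Approximate E[X] = integral of x * f(x) dx with a midpoint Riemann sum.
n = 100_000
h = 1.0 / n
expectation = sum(((k + 0.5) * h) * 1.0 * h for k in range(n))
print(expectation)  # very close to 0.5
```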
In my last question, Eevee Trainer kindly answered by stating that
$$\text{Var}(N) = \underbrace{\sum_{N=0}^4 N^2 \cdot \Bbb P(N)}_{\Bbb E(N^2)} - \underbrace{\left( \sum_{N=0}^4 N \cdot \Bbb P(N) \right)^2}_{\Bbb E(N)^2} = \frac 1 5 \sum_{N=0}^4 N^2 - 4$$
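As a sanity check on that computation (here $N$ is uniform on $\{0,1,2,3,4\}$, so $\Bbb P(N)=1/5$ for each value and $\Bbb E(N)=2$), a short Python sketch:

```python
# N uniform on {0, 1, 2, 3, 4}, each value with probability 1/5.
values = range(5)
p = 1 / 5

e_n2 = sum(n ** 2 * p for n in values)  # E(N^2) = (0+1+4+9+16)/5 = 6
e_n = sum(n * p for n in values)        # E(N) = 2
variance = e_n2 - e_n ** 2              # Var(N) = 6 - 4 = 2
print(variance)
```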
Now, here's where I've had persistent confusion about the definition and application of expected value. In terms of the theory, I don't understand why we have $\sum_{N=0}^4 N^2 \cdot P(N)$ rather than $\sum_{N=0}^4 N^2 \cdot P(N^2)$ in the case above. I can see that the latter doesn't make sense computationally, but I don't actually understand why the former is the correct application of the definition.
I would greatly appreciate it if people could please take the time to clarify this.
Suppose $X = \begin{cases} \phantom{+}1 \\ \phantom{+}0 & \text{each with probability }1/3. \\ -1 \end{cases}$
\begin{align} \textbf{right: } & \operatorname E(X^2) = 0\cdot\Pr(X^2 = 0) + 1\cdot \Pr(X^2=1) \\[8pt] \textbf{right: } & \operatorname E(X^2) = 0^2 \cdot\Pr(X=0) + 1^2 \cdot \Pr(X=1) + (-1)^2\cdot\Pr(X=-1) \\[8pt] \textbf{wrong: } & \operatorname E(X^2) = 0^2\cdot\Pr(X^2=0^2) + 1^2 \cdot\Pr(X^2 = 1^2) + (-1)^2\cdot\Pr(X^2=(-1)^2) \end{align}
The first two lines agree because $\Pr(X^2=1) = \Pr(X=1) + \Pr(X=-1) = 2/3$. The third line is wrong because it uses $\Pr(X^2=1) = 2/3$ twice, once for the outcome $X=1$ and once for $X=-1$, double-counting those outcomes and giving $4/3$ instead of the correct value $2/3$.
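The right/wrong computations above can be checked exactly with Python's `fractions` module:

```python
from fractions import Fraction

# The toy variable from the answer: X in {-1, 0, 1}, each with probability 1/3.
p = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}

# Right: weight each squared value by the probability of the ORIGINAL outcome.
right = sum(x ** 2 * p[x] for x in p)  # 0 + 1/3 + 1/3 = 2/3

# Wrong: weight x^2 by Pr(X^2 = x^2). Since x = 1 and x = -1 both map to
# x^2 = 1, the weight Pr(X^2 = 1) = 2/3 gets counted twice.
pr_x2 = {0: Fraction(1, 3), 1: Fraction(2, 3)}  # distribution of X^2
wrong = sum(x ** 2 * pr_x2[x ** 2] for x in p)  # 0 + 2/3 + 2/3 = 4/3

print(right, wrong)  # 2/3 4/3
```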