In a book I was reading, it seemed to imply that $E(X-E(X))=0$. My intuition tells me this is true, because if $E(X)$ is the "centre", then the average displacement from this centre should be 0. However, can someone show me a formal proof (assuming it is true)?
Does $E(X-E(X))=0$?
18.6k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 5 answers below.
Expectations are additive: $$E(A+B) = E(A) + E(B)$$
So in particular, $$\begin{align}E(X-E(X)) &= E(X) - E(E(X))\\ & = E(X) - E(X)\\ &= 0.\end{align}$$ where the second line follows because $E(X)$ is a constant and $E(C) = C$ for constants.
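As a quick sanity check of this identity, here is a small Python sketch using a made-up four-point distribution (the values and probabilities below are arbitrary, chosen only for illustration):

```python
# Numeric check that E(X - E(X)) = 0 for a small discrete distribution.
values = [1.0, 2.0, 5.0, 10.0]
probs  = [0.1, 0.2, 0.3, 0.4]   # probabilities must sum to 1

# E(X) = sum over x of x * P(X = x)
ex = sum(x * p for x, p in zip(values, probs))

# E(X - E(X)) = sum over x of (x - E(X)) * P(X = x)
centred = sum((x - ex) * p for x, p in zip(values, probs))

print(abs(centred) < 1e-12)  # True: zero up to floating-point rounding
```

Any other choice of values and probabilities gives the same result, which is exactly what the linearity argument above guarantees.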
$$ \mathbb{E}(X-\mathbb{E}(X)) = \mathbb{E}(X) - \mathbb{E}(\mathbb{E}(X)) = \mathbb{E}(X) - \mathbb{E}(X) = 0.$$
Or, less directly, via the variance: $$ \mathbb{E}(X-\mathbb{E}(X))^2 = \mathbb{E}\big((X-\mathbb{E}(X))^2\big) - \mbox{Var}(X-\mathbb{E}(X)) = \mbox{Var}(X) - \mbox{Var}(X) = 0, $$ so $\mathbb{E}(X-\mathbb{E}(X)) = 0$.
Yes, it's true in general by linearity of expectations.
But as an illustration, consider the discrete, finite case where $X$ takes the values $x_1, x_2, \dots, x_n$, each with equal probability $1/n$.
Then
$$ E(X)=\dfrac{1}{n} \sum_{i=1}^{n} x_i $$
and
$$ E(X-E(X))=\dfrac{1}{n}\sum_{j=1}^{n}\left(x_j-\dfrac{1}{n}\sum_{i=1}^{n}x_i \right)=\dfrac{1}{n}\left(\sum_{j=1}^{n}x_j - \sum_{i=1}^n x_i\right)=0 $$
In the last step we pulled the sum over $i$ out of the sum over $j$, since it does not depend on $j$. Summing that constant over the $n$ values of $j$ multiplies it by $n$, and this $n$ cancels the factor of $1/n$ attached to the $\sum_i$ term, leaving $\sum_j x_j - \sum_i x_i = 0$.
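The equal-probability case can also be verified numerically. The Python sketch below draws arbitrary sample values (the choice of distribution is incidental) and checks that the average deviation from the mean vanishes:

```python
import random

# Equal-probability case: n values x_1, ..., x_n, each with probability 1/n.
random.seed(0)                   # fixed seed so the run is reproducible
xs = [random.gauss(0, 3) for _ in range(1000)]
n = len(xs)

mean = sum(xs) / n                               # E(X) = (1/n) * sum of x_i
deviation_mean = sum(x - mean for x in xs) / n   # E(X - E(X))

print(abs(deviation_mean) < 1e-9)  # True: zero up to rounding error
```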
More generally, a random variable (r.v.) $X$ can have a discrete distribution, where the probability that $X$ takes a given value $x$ is $P(X=x)$ (the probability mass function); a continuous distribution, where $X$ has a probability density function $f(x)$; or a mixture of the two. If $X$ is wholly discrete, then $f(x)$ is zero for all $x$; likewise, for a fully continuous r.v., $P(X=x) = 0$ for all $x$.
Then (taking, for simplicity, $X$ supported on the nonnegative integers and reals) $$E(X) = \sum_{x=0}^\infty x\,P(X=x) + \int_0^\infty x f(x)\,\mathrm{d}x$$
Which means
$$E(X-E(X)) = \sum_{x=0}^\infty \left(x- \left[\sum_{x=0}^\infty x\,P(X=x) + \int_0^\infty x f(x)\,\mathrm{d}x\right]\right)P(X=x) + \int_0^\infty \left(x-\left[\sum_{x=0}^\infty x\,P(X=x) + \int_0^\infty x f(x)\,\mathrm{d}x\right]\right)f(x)\,\mathrm{d}x$$
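As a sketch of this mixed discrete/continuous case, the Python snippet below uses a hypothetical distribution with a single atom at $x=2$ of probability $1/2$ and density $f(x)=1/2$ on $[0,1]$, and evaluates the sum-plus-integral expression numerically:

```python
# Verify E(X - E(X)) = 0 for a made-up mixed distribution:
#   discrete part:   P(X = 2) = 0.5 (a point mass)
#   continuous part: density f(x) = 0.5 on [0, 1], zero elsewhere

def integrate(g, a, b, steps=10000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

pmf = {2.0: 0.5}        # the discrete atoms: P(X = x)
f = lambda x: 0.5       # the density on [0, 1]

# E(X) = sum_x x P(X = x) + integral of x f(x) dx
ex = sum(x * p for x, p in pmf.items()) + integrate(lambda x: x * f(x), 0, 1)

# E(X - E(X)) = sum_x (x - E(X)) P(X = x) + integral of (x - E(X)) f(x) dx
centred = (sum((x - ex) * p for x, p in pmf.items())
           + integrate(lambda x: (x - ex) * f(x), 0, 1))

print(abs(centred) < 1e-9)  # True
```

Here $E(X) = 2 \cdot 0.5 + \int_0^1 0.5x \,\mathrm{d}x = 1.25$, and the atom's deviation $(2-1.25)\cdot 0.5 = 0.375$ exactly cancels the continuous part's $-0.375$, just as the algebra predicts.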
This may seem like overkill, BUT you can then just follow the algebra without quoting presumed theorems (linearity of expectation, etc.), which can easily be derived anyway.
It is true by the linearity of expectation:
$$E(X - E(X)) = E(X) - E(E(X))$$
Since $E(X)$ is a constant, $E(E(X))$ is just $E(X)$, and therefore $E(X) - E(E(X)) = 0$.