Expected value of $|X-\mu|$ without measure/integration theory?


Let $X$ be a normally distributed random variable with mean $\mu$ and variance $\sigma^2\neq 0$. Our tutor said that "it is obvious that $$ \mathbb{E}(|X-\mu|)=\int\limits_{-\infty}^{\infty}|x-\mu|\frac{1}{\sqrt{2\pi \sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}~\!dx=\int\limits_{-\infty}^{\infty}\frac{|t|}{\sqrt{2\pi \sigma^2}}e^{-\frac{t^2}{2\sigma^2}}~\!dt, \text{ where }t:=x-\mu~~~~~(1) $$

and so $\mathbb{E}(|X-\mu|)=\mathbb{E}(|Y|)$", where $Y\sim\mathcal{N}(0,\sigma^2)$.


We have no theorem or lemma that establishes equation $(1)$, nor have we attended a measure theory course yet. A quick search (see $E(X)= \int_{x \in \mathbb{R}}{x \cdot f_X(x) dx}$ implies $Eg(X)= \int_{x \in \mathbb{R}}{g(x) \cdot f_X(x) dx}$ on continuous random variables) shows that this claim can indeed be proved using theorems from measure/integration theory courses. So I think it's no big deal if one looks at $(1)$ with that kind of background.

Intuitively $(1)$ is clear, as I can fall back on the discrete case; but the proof in the discrete case can't be applied here. I am wondering whether I can still prove $(1)$ without any notions or background from measure theory.

BEST ANSWER

Here is my take:

  • The first identity in (1) is the law of the unconscious statistician, as rubikscube09 pointed out in the comments, though it is sometimes taken as "the definition" of expectation in more applied sciences such as physics. But it is good to know that it has a solid mathematical foundation and that it is non-trivial, though very intuitive. For a rigorous definition of expectation, I am afraid you do need some probability theory (which is based on measure theory), so if you don't want to take the above as an "intuitive definition", you will require some background.
  • The second identity in (1) is integration by substitution (also called the change of variables formula) with the substitution you wrote down: $t := x-\mu$. This is a basic result from calculus and does not require any measure theory or probability theory.
  • The conclusion $\mathbb{E}(|X-\mu|)=\mathbb{E}(|X|)$ is incorrect; it should be $\mathbb{E}(|X-\mu|)=\mathbb{E}(|Y|)$, where $Y \sim \mathcal{N}(0,\sigma^{2})$ is a normally distributed random variable with mean zero and the same variance $\sigma^{2}$ as $X$, as can be seen from the formula (there is no shift by $\mu$ in the density of the second integrand). $\mathbb{E}(|X|)$ would have been $$ \mathbb{E}(|X|) = \int\limits_{-\infty}^{\infty}\frac{|t|}{\sqrt{2\pi \sigma^2}}e^{-\frac{(t-\mu)^2}{2\sigma^2}}~\!dt, $$ so that you can see the difference.
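For completeness, the substitution step can be written out explicitly; it is nothing more than the calculus change-of-variables formula with $t := x-\mu$, so $dt = dx$ and the limits $\pm\infty$ are unchanged:

```latex
\int_{-\infty}^{\infty}|x-\mu|\,\frac{1}{\sqrt{2\pi\sigma^2}}
  e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx
\;\overset{t=x-\mu}{=}\;
\int_{-\infty}^{\infty}|t|\,\frac{1}{\sqrt{2\pi\sigma^2}}
  e^{-\frac{t^2}{2\sigma^2}}\,dt
= \int_{-\infty}^{\infty}|t|\,f_Y(t)\,dt,
```

where $f_Y$ denotes the density of $Y \sim \mathcal{N}(0,\sigma^2)$. The last integral equals $\mathbb{E}(|Y|)$ by the first bullet point (the law of the unconscious statistician applied to $g(t) = |t|$).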

Hope this helps.
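Not a proof, of course, but here is a quick numerical sanity check of the identity $\mathbb{E}(|X-\mu|)=\mathbb{E}(|Y|)$. This is a Monte Carlo sketch; the concrete values $\mu = 3$, $\sigma = 2$ and the sample size are arbitrary choices for illustration:

```python
import numpy as np

# Sanity check: E|X - mu| should equal E|Y|, where
# X ~ N(mu, sigma^2) and Y ~ N(0, sigma^2).
mu, sigma = 3.0, 2.0          # arbitrary example parameters
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(mu, sigma, n)   # samples of X
y = rng.normal(0.0, sigma, n)  # samples of Y

lhs = np.abs(x - mu).mean()    # Monte Carlo estimate of E|X - mu|
rhs = np.abs(y).mean()         # Monte Carlo estimate of E|Y|

# Both estimates should be close to sigma * sqrt(2/pi),
# the known mean absolute deviation of a centered normal.
print(lhs, rhs)
```

With a million samples the two estimates agree to within a few decimal places, and both sit near $\sigma\sqrt{2/\pi}$.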