I'm reading a book at the moment and it has a theorem that states this:
if $X \sim N(\mu,\sigma^2)$ then
$E(X) = \mu$
My question is: when would it not equal $\mu$?
I thought the expected value is the mean of any distribution, not just the normal.
Is this rule only the case for the normal distribution, all symmetric distributions, or all distributions?
Thanks
The expected value of a distribution is often expressed directly in terms of its parameters. For instance:
$$ X \sim Bin(n,p), E(X) = np$$
$$ X \sim Geom(p) , E(X) = \frac{1}{p} $$
$$ X \sim NegBin(r,p) , E(X) = \frac{r}{p}$$
$$ X \sim Pois(\lambda) , E(X) = \lambda $$
$$ X \sim N(\mu, \sigma^{2}) , E(X) = \mu $$
$$ X \sim Gamma(\alpha,\beta) ,E(X) = \frac{\alpha}{\beta}$$
$$ X \sim U(a,b) ,E(X) = \frac{b+a}{2} $$
$$ X \sim Beta(\alpha,\beta) ,E(X) = \frac{\alpha}{\alpha +\beta} $$
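As a quick sanity check on one entry of this table, the discrete definition of expectation can be applied directly to the Binomial pmf. This is just an illustrative sketch (the `binom_pmf` helper is my own, not from any particular library):

```python
import math

# Hypothetical helper: the Binomial(n, p) pmf, P(X = k) = C(n, k) p^k (1-p)^(n-k)
def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3

# E(X) = sum over k of k * P(X = k), per the discrete definition of expectation
expectation = sum(k * binom_pmf(k, n, p) for k in range(n + 1))

# Agrees with the closed-form parameter expression n * p = 3.0 (up to float error)
print(expectation)
```

The sum over the full support reproduces $np$, exactly as the table claims.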
So $E(X)$ ends up being a function of the parameters simply because of how the expected value is defined:
$$ E(X) = \int_{-\infty}^{\infty} xf(x) dx$$
where $f(x)$ is the density function, for continuous random variables;
for discrete random variables we have $$ E(X) = \sum_{i} x_{i}\, p(x_{i}) $$
Given this, it isn't surprising that the parameters of the distribution show up in the expectation. For instance, suppose $X \sim U(a,b)$. Its pdf is given by
$$ f(x) = \begin{cases} \frac{1}{b-a} & \textrm{ for } a \leq x \leq b \\ \\ 0 & \textrm{ for } x < a \textrm{ or } x > b \end{cases} $$
$$ E(X) = \int_{-\infty}^{\infty} x f(x)\, dx $$
Since $f(x) = 0$ outside $[a,b]$, the integral reduces to
$$ E(X) = \int_{a}^{b} x \frac{1}{b-a}\, dx $$
$$ E(X) = \frac{1}{b-a}\int_{a}^{b} x\, dx $$
$$ E(X) = \frac{1}{b-a}\, \frac{x^{2}}{2}\Big|_{a}^{b} $$
$$ E(X) = \frac{b^{2}-a^{2}}{2(b-a)} $$
$$ E(X) = \frac{(b-a)(b+a)}{2(b-a)} $$
$$ E(X) = \frac{b+a}{2} $$
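The derivation above can also be checked numerically: the sample mean of draws from $U(a,b)$ should approach $\frac{b+a}{2}$. A minimal sketch using Python's standard library (the particular values $a = 2$, $b = 5$ are just an example):

```python
import random

random.seed(0)  # fixed seed so the check is reproducible

a, b = 2.0, 5.0
n = 200_000

# Monte Carlo estimate of E(X) for X ~ U(a, b)
sample_mean = sum(random.uniform(a, b) for _ in range(n)) / n

# Should be close to the closed form (b + a) / 2 = 3.5
print(sample_mean, (a + b) / 2)
```

With this many samples the estimate lands within a few thousandths of $3.5$, consistent with the integral computed above.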