Proof of Expectation of a Function of a Random Variable


$$ E(aX + b) = aE(X) + b $$

$$ E(aX + b) = \sum_x(ax+b)p_X(x) = a\sum_xp_X(x) + \sum_xbp_X(x) = aE(X) + b$$

Could someone expand on this proof a little? I'm a little confused about why the constant $b$ is included in a sum, given that it's not multiplied by the random variable $X$. Or is that purely a formalism when dealing with an expectation?


Taking it one step at a time.

$$\begin{align}
\mathsf E(aX + b) &= \sum_x(ax+b)\,p_X(x) && \text{definition of expectation / law of the unconscious statistician} \\[1ex]
&= \sum_x \big(ax\,p_X(x) + b\,p_X(x)\big) && \text{distributivity} \\[1ex]
&= \sum_x ax\,p_X(x) ~+~ \sum_x b\,p_X(x) && \text{splitting the sum} \\[1ex]
&= a\sum_x x\,p_X(x) ~+~ b\sum_x p_X(x) && \text{factoring out the constants} \\[1ex]
&= a\sum_x x\,p_X(x) ~+~ b && \text{total probability: } \textstyle\sum_x p_X(x) = 1 \\[1ex]
&= a\,\mathsf E(X) + b && \text{definition of } \mathsf E(X) \\[2ex]
\therefore\quad \mathsf E(aX+b) ~&=~ a\,\mathsf E(X) + b && \text{quod erat demonstrandum}
\end{align}$$

$\blacksquare$
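If it helps to see the identity numerically, here is a small sketch that computes both sides of $\mathsf E(aX+b) = a\,\mathsf E(X)+b$ for an arbitrary discrete distribution. The support, pmf values, and the constants `a`, `b` are illustrative choices, not from the question.

```python
# Numerical check of E(aX + b) = a*E(X) + b for a small discrete rv.
# The support/pmf/constants below are illustrative; any pmf summing
# to 1 will do.
support = [0, 1, 2, 3]
pmf = [0.1, 0.2, 0.3, 0.4]  # probabilities, sum to 1

a, b = 5.0, 2.0

# Left side: E(aX + b) computed directly from the definition
# (summing (ax + b) weighted by p_X(x), per the LOTUS step above).
lhs = sum((a * x + b) * p for x, p in zip(support, pmf))

# Right side: a*E(X) + b, using E(X) = sum_x x * p_X(x).
ex = sum(x * p for x, p in zip(support, pmf))
rhs = a * ex + b

print(lhs, rhs)  # the two values agree (up to floating-point error)
```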

> I'm a little confused why the constant b is included in a sum given it's not multiplied by the rv X.

That step just splits the sum into two pieces so that each term can be treated separately. Then, by factoring the constants out of each sum, you can see that $b$ is multiplied by the total probability, which equals $1$; this demonstrates that $\mathsf E(b) = b$. That was a step your proof skimmed over, presumably because it seemed obvious.
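The constant-term step can be sketched on its own: summing $b\,p_X(x)$ over the support gives $b \sum_x p_X(x) = b \cdot 1 = b$. The pmf below is an arbitrary illustration.

```python
# E(b) = sum_x b * p_X(x) = b * sum_x p_X(x) = b, since the
# probabilities sum to 1. Illustrative pmf, not from the question.
pmf = [0.25, 0.25, 0.5]
b = 7.0

# b appears in every term of the sum, weighted by p_X(x).
e_b = sum(b * p for p in pmf)

print(e_b)  # equals b, because the total probability is 1
```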