Why is $f_x(Ax + b) = f_x(x)$?


Let $x \in \mathbb{R}^n$ be a random vector, $A \in \mathbb{R}^{m \times n}$ a matrix, and $b \in \mathbb{R}^m$ a vector.

Now I should prove that the expected value is linear:

$$\mathbb{E}(Ax + b) = A \cdot \mathbb{E}(x) + b$$

The professor shared his proof. I understand every part of it except the first step:

$$\mathbb{E}(Ax+b) = \int (Ax + b) f_x (x) \mathrm{d} x$$

Why is it not the following? Or are the two the same?

$$\mathbb{E}(Ax+b) = \int (Ax + b) f_x (Ax + b) \mathrm{d} x$$

Best answer:

As explained by Did, I confused some variables. The correct first step would have been

$$\mathbb{E}(Ax+b) = \int u f_{Ax+b} (u) \mathrm{d} u$$

By the law of the unconscious statistician (thank you, Omnomnomnom), this is equal to

$$\int (Au+b) f_{x} (u) \mathrm{d} u$$
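For completeness (this step is not spelled out in the answer above, but it follows directly), the rest of the proof uses linearity of the integral and the fact that a density integrates to $1$:

$$\int (Au + b) f_x(u) \, \mathrm{d} u = A \int u \, f_x(u) \, \mathrm{d} u + b \int f_x(u) \, \mathrm{d} u = A \cdot \mathbb{E}(x) + b.$$

The identity can also be checked numerically. The sketch below uses hypothetical dimensions ($m = 2$, $n = 3$) and a standard normal choice for $x$; note that for sample means the identity holds exactly (up to floating-point error), not just in the limit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and parameters, chosen only for illustration.
m, n = 2, 3
A = rng.normal(size=(m, n))
b = rng.normal(size=m)

# Many samples of a random vector x in R^n (standard normal shifted by 1).
x = rng.normal(loc=1.0, size=(100_000, n))

lhs = (x @ A.T + b).mean(axis=0)   # sample mean of Ax + b
rhs = A @ x.mean(axis=0) + b       # A times the sample mean of x, plus b

# Linearity of the (sample) mean: the two sides coincide.
assert np.allclose(lhs, rhs)
```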