Proving the properties of mutually independent random variables


Question

I already answered part (a). It was straightforward that the answer is $$f_{X_1\ldots X_n}(x_1,\ldots,x_n)=f_{X_1}(x_1)f_{X_2}(x_2)\cdots f_{X_n}(x_n),$$ assuming mutual independence.

Unfortunately, it is part (b) where I become confused. I know that it holds true, because we have previously learned about the properties of expectations of independent random variables, but how can I *show* it? In other words, can somebody assist me in breaking it down? I always want to make sure that I have my fundamentals intact before proceeding to more complicated problems.



BEST ANSWER

By the law of the unconscious statistician, and using that the random variables are iid, we have
\begin{align*}
\mathbf{E}\left[f_{X_1\ldots X_n}(X_1,\ldots,X_n)\right]
&= \int_{\mathbb{R}^{n}} f_{X_1\ldots X_n}(x_1,\ldots,x_n)\,f_{X_1\ldots X_n}(x_1,\ldots,x_n)\,\mathrm{d}x_{1}\cdots\mathrm{d}x_{n}\\
&= \int_{\mathbb{R}^{n}} \left[f_{X_{1}}(x_{1})\cdots f_{X_{n}}(x_{n})\right]\left[f_{X_{1}}(x_{1})\cdots f_{X_{n}}(x_{n})\right]\mathrm{d}x_{1}\cdots\mathrm{d}x_{n}\\
&= \int_{\mathbb{R}}\left[f_{X_{1}}(x_{1})\right]^{2}\mathrm{d}x_{1}\times\int_{\mathbb{R}}\left[f_{X_{2}}(x_{2})\right]^{2}\mathrm{d}x_{2}\times\cdots\times\int_{\mathbb{R}}\left[f_{X_{n}}(x_{n})\right]^{2}\mathrm{d}x_{n}\\
&= \mathbf{E}\left[f_{X_{1}}(X_{1})\right]\times\mathbf{E}\left[f_{X_{2}}(X_{2})\right]\times\cdots\times\mathbf{E}\left[f_{X_{n}}(X_{n})\right]\\
&= \left[\mathbf{E}\,f_{X_{1}}(X_{1})\right]^{n},
\end{align*}
where the second equality uses the factorization from part (a), the third uses Fubini to split the integral over $\mathbb{R}^n$ into a product of one-dimensional integrals, and the last uses that the $X_i$ are identically distributed, so all $n$ expectations are equal.

and we are done.

Hopefully this helps!
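If you want to convince yourself numerically as well, here is a quick Monte Carlo sanity check of the identity (not part of the derivation above; the choice of iid standard normals is just an illustrative assumption, for which $\mathbf{E}\,f_{X_1}(X_1)=\int\varphi(x)^2\,\mathrm{d}x = 1/(2\sqrt{\pi})$ in closed form):

```python
import math
import random

random.seed(0)
n, trials = 3, 200_000

def phi(t):
    """Standard normal density: the common marginal f_{X_i}."""
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

lhs_sum = 0.0   # accumulates f(X_1,...,X_n) = phi(X_1)...phi(X_n)
marg_sum = 0.0  # accumulates phi(X_1) to estimate E[f(X_1)]
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    prod = 1.0
    for x in xs:
        prod *= phi(x)
    lhs_sum += prod
    marg_sum += phi(xs[0])

lhs = lhs_sum / trials            # estimate of E[f(X_1,...,X_n)]
rhs = (marg_sum / trials) ** n    # estimate of (E[f(X_1)])^n
exact = (1 / (2 * math.sqrt(math.pi))) ** n  # closed form for the normal case

print(lhs, rhs, exact)
```

Both Monte Carlo estimates should agree with the closed-form value to about three decimal places, which matches $\mathbf{E}[f(X_1,\ldots,X_n)] = [\mathbf{E}\,f_{X_1}(X_1)]^n$.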


This question already has an answer, but I want to provide a much quicker solution. Since $f$ is a density, $f$ is measurable. Consequently, if $X_1,X_2,\dots, X_n$ are independent, then $f(X_1), f(X_2), \dots, f(X_n)$ are also independent (cf. https://stats.stackexchange.com/questions/94872/functions-of-independent-random-variables). Thus, $$\mathbb E[f(X_1, X_2, \dots, X_n)] = \mathbb E[f(X_1)f(X_2)\cdots f(X_n)] = \mathbb E[f(X_1)]^n,$$ where part (a) gives the first equality, and the second equality uses the fact that the expectation of a product of independent random variables is the product of their expectations, together with the $X_i$ being identically distributed, so each factor $\mathbb E[f(X_i)]$ equals $\mathbb E[f(X_1)]$.