Proof of Expected Value for Independent random variables with two possibilities


Let $B_i$ ($i \in \mathbb{N}$) be random variables, each taking the value $1$ with probability $p$ and $-1$ with probability $1-p$, where $0<p<1$. Assume that $B_i$ and $B_j$ are independent whenever $i \neq j$. Let $S_N=\sum_{i=1}^{N} B_i$ for $N \in \mathbb{N}$, and let $E[\cdot]$ denote expectation.

Show that $E\left[{B_i}^m{B_j}^n\right]=E\left[{B_i}^m\right]E\left[{B_j}^n\right]$ for any natural numbers $m$ and $n$ if $i\neq j$.

I am really new to this subject, so I hope someone experienced can help show me how I should prove this.


Best answer:

If two random variables $X$ and $Y$ are independent then $E(XY)=E(X)E(Y)$. This result does not depend on the type of random variables.

This is because $$E(XY)=\sum_{x,y}xyP(X=x,Y=y)=\sum_{x,y}xyP(X=x)P(Y=y)\\=\Big(\sum_xxP(X=x)\Big)\Big(\sum_yyP(Y=y)\Big),$$ which uses independence to go between the second and third expressions.
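As a quick numerical sanity check (not part of the proof), the factorization above can be verified by exact enumeration over the four outcomes of two independent $\pm1$ variables. The probabilities `p` and `q` below are just illustrative choices, not values from the question:

```python
from itertools import product

# Exact check of E(XY) = E(X)E(Y) for two independent +/-1 variables.
# p and q are illustrative probabilities, not from the original post.
p, q = 0.3, 0.7

def pmf(value, prob_one):
    """P(X = value) for a +/-1 variable with P(X = 1) = prob_one."""
    return prob_one if value == 1 else 1 - prob_one

# E(XY) via the joint pmf, which factors because X and Y are independent.
e_xy = sum(x * y * pmf(x, p) * pmf(y, q)
           for x, y in product([1, -1], repeat=2))

# E(X) and E(Y) computed separately.
e_x = sum(x * pmf(x, p) for x in [1, -1])
e_y = sum(y * pmf(y, q) for y in [1, -1])

assert abs(e_xy - e_x * e_y) < 1e-12  # E(XY) = E(X)E(Y)
```

Here $E(X)=2p-1$ and $E(Y)=2q-1$, and the enumerated $E(XY)$ matches their product exactly.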

For sums, much more is true: $E(X+Y)=E(X)+E(Y)$ even if the variables are not independent.
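To illustrate the contrast, here is a small check with a *fully dependent* pair, taking $Y = -X$ (an illustrative choice): linearity $E(X+Y)=E(X)+E(Y)$ still holds, while $E(XY)=E(X)E(Y)$ fails:

```python
# Linearity of expectation holds even without independence.
# Take Y = -X, a fully dependent pair (illustrative choice), with P(X=1)=p.
p = 0.3
outcomes = [(1, -1, p), (-1, 1, 1 - p)]  # (x, y, probability)

e_x = sum(x * prob for x, _, prob in outcomes)
e_y = sum(y * prob for _, y, prob in outcomes)
e_sum = sum((x + y) * prob for x, y, prob in outcomes)

# E(X+Y) = E(X) + E(Y) despite the dependence.
assert abs(e_sum - (e_x + e_y)) < 1e-12

# But E(XY) != E(X)E(Y): XY = -1 always, while E(X)E(Y) = -(2p-1)^2.
e_xy = sum(x * y * prob for x, y, prob in outcomes)
assert abs(e_xy - (-1)) < 1e-12
assert abs(e_x * e_y - (-(2 * p - 1) ** 2)) < 1e-12
```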

The $m$ and $n$ in your question are a distraction: since $B_i=\pm1$, if $m$ is odd then $B_i^m = B_i$, and if $m$ is even then $B_i^m = 1$, so $E[B_i^m]=1$.
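The odd/even observation can be checked directly from the two-point pmf: $E[B^m]$ is $2p-1$ for odd $m$ and $1$ for even $m$. The value `p = 0.3` below is just an illustrative choice:

```python
# For a +/-1 variable B with P(B=1)=p: B^m = B when m is odd and B^m = 1
# when m is even, so E(B^m) is either 2p-1 or 1. p is an illustrative value.
p = 0.3

def e_power(m, p):
    """E(B^m) computed directly from the two-point pmf."""
    return (1 ** m) * p + ((-1) ** m) * (1 - p)

for m in range(1, 8):
    expected = (2 * p - 1) if m % 2 == 1 else 1.0
    assert abs(e_power(m, p) - expected) < 1e-12
```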

Another answer:

In probability theory, two random variables $A$ and $B$ are said to be independent when their joint distribution factors: $F_{A,B}(a, b) = F_A(a)F_B(b)$. When densities (or mass functions) exist, this is equivalent to $f_{A,B}(a, b) = f_A(a)f_B(b)$.

In your case $$E\left[B^m_i B^n_j\right] = \sum_{(b_i,b_j) \in \{\pm1\}^2} b^m_i b^n_j\, f_{B_i, B_j}(b_i,b_j) = \sum_{(b_i, b_j) \in \{\pm1\}^2} b^m_i b^n_j\, f_{B_i}(b_i) f_{B_j}(b_j)$$

$$=\Big(\sum_{b_i \in\{\pm1\}} b^m_i\, f_{B_i}(b_i)\Big)\Big(\sum_{b_j \in \{\pm1\}} b^n_j\, f_{B_j}(b_j)\Big) = E\left[B^m_i\right]E\left[B^n_j\right].$$
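This enumeration can be replayed numerically for a range of powers (not a proof, just a sanity check). The shared success probability `p = 0.3` is an illustrative value:

```python
from itertools import product

# Exact enumeration over {+1,-1}^2 verifying E[B_i^m B_j^n] = E[B_i^m] E[B_j^n]
# for independent +/-1 variables with P(B=1)=p. p is an illustrative value.
p = 0.3

def pmf(b):
    """P(B = b) for the two-point distribution."""
    return p if b == 1 else 1 - p

def e_moment(m):
    """E(B^m), summed over the two outcomes."""
    return sum((b ** m) * pmf(b) for b in (1, -1))

for m, n in product(range(1, 5), repeat=2):
    lhs = sum((bi ** m) * (bj ** n) * pmf(bi) * pmf(bj)
              for bi, bj in product((1, -1), repeat=2))
    assert abs(lhs - e_moment(m) * e_moment(n)) < 1e-12
```

Consistent with the previous answer's remark, `e_moment(m)` evaluates to $2p-1$ for odd $m$ and $1$ for even $m$.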