Probability: Properties of a Binomial Random Variable


Hi there. I've looked through the forums and haven't really been able to find the clarification I need. If someone could (by happenstance of knowing about it specifically) point me in the right direction, that'd be great; otherwise, anyone who can clear up my issue would also be much appreciated.

Section 4.6.1, Properties of Binomial Random Variables, page 139 of *A First Course in Probability*.

(Paraphrasing) We're concerned with the properties of $X\sim B(n,p)$; to begin, we compute its expected value and variance. Now: $$E[X^{k}]=\sum_{i=0}^{n}i^{k}{{n}\choose{i}}p^{i}(1-p)^{n-i}$$ I understand this to be $g(X)=X^k$ and then calculating the expectation as $E[g(X)]=\sum_{i}g(x_i)p(x_i)$. Since the $i=0$ term vanishes, $$E[X^{k}]=\sum_{i=1}^{n}i\cdot i^{k-1}{{n}\choose{i}}p^{i}(1-p)^{n-i}$$ Using the identity $$i{{n}\choose{i}}=n{{n-1}\choose{i-1}}$$ gives $$=np\sum_{i=1}^{n}i^{k-1}{{n-1}\choose{i-1}}p^{i-1}(1-p)^{n-i}$$ We then substitute $j=i-1$ into the above: $$=np\sum_{j=0}^{n-1}(j+1)^{k-1}{{n-1}\choose{j}}p^{j}(1-p)^{n-j-1}$$ I'm fine until here; the above seems straightforward. $$=npE[(Y+1)^{k-1}]$$ where $Y\sim B(n-1,p)$.
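As a quick numerical sanity check of the identity $i\binom{n}{i}=n\binom{n-1}{i-1}$ used above, here is a small Python sketch (the value of $n$ is arbitrary):

```python
from math import comb

# Check i*C(n, i) == n*C(n-1, i-1) for every valid i;
# n = 10 is an arbitrary illustrative value.
n = 10
for i in range(1, n + 1):
    assert i * comb(n, i) == n * comb(n - 1, i - 1)
print("identity holds for n =", n)
```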

So I have two small hiccups, both of which I'm pretty certain are small issues that you're all going to call me a plonker for.

First: after the $j=i-1$ substitution, the change in the summation boundaries is because for every value that $i$ takes, $j$ takes one less, so $i$ running from $1$ to $n$ means $j$ runs from $0$ to $n-1$, right?

The second is: in the final form

$$E[X^{k}]=npE[(Y+1)^{k-1}]$$ why is it $Y+1$? I don't understand the reasoning behind it.

Thanks in advance for the help.



To the first question, of course that's why, you plonker! (What's a plonker?)

To the second question, we have $$ E((Y+1)^{k-1}) = \sum_{j=0}^\infty (j+1)^{k-1}P(Y=j)$$ and we can see that the derived equation is of this form when you substitute for $P(Y=j)$ the rule for $Y\sim Bin(n-1,p)$: $$ P(Y=j) = {n-1\choose j}p^{j}(1-p)^{n-1-j}$$ for $0\le j \le n-1$ (zero otherwise).

This is just a specific case of the more general $$ E(f(Y)) = \sum_{j=0}^\infty f(j)P(Y=j) $$ when $Y$ is a discrete RV with support $0,1,2,3,\ldots$
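If it helps, this general formula is easy to sanity-check numerically. A minimal Python sketch, where $m$, $q$, and the function $f$ are all chosen arbitrarily for illustration:

```python
import random
from math import comb

random.seed(0)
m, q = 5, 0.4               # Y ~ Bin(m, q); arbitrary illustrative values
f = lambda j: (j + 1) ** 2  # any function of Y

# Exact value from E(f(Y)) = sum_j f(j) * P(Y = j)
exact = sum(f(j) * comb(m, j) * q**j * (1 - q) ** (m - j) for j in range(m + 1))

# Monte Carlo estimate: simulate Y as a sum of m Bernoulli(q) trials
trials = 200_000
total = 0.0
for _ in range(trials):
    y = sum(random.random() < q for _ in range(m))
    total += f(y)
estimate = total / trials

print(exact, estimate)  # the two should agree to roughly two decimal places
```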


Does it help you see the pattern if you use the literals $x,y$ as indices for the series rather than $i,j$?

You have established that where $X\sim\mathcal{Bin}(n,p), Y\sim\mathcal{Bin}(n-1, p)$ we have: $$\begin{align}\because\quad\sum_{x=0}^n x^k \binom{n}{x}p^x(1-p)^{n-x} ~&=~ np\sum_{y=0}^{(n-1)} (y+1)^{(k-1)}\binom {(n-1)}{y}p^y(1-p)^{(n-1)-y} \\[2ex] \sum_{x=0}^n x^k f_X(x)\qquad\qquad\quad&=~ np\,\sum_{y=0}^{(n-1)} (y+1)^{(k-1)}f_Y(y) \\[2ex]\therefore\quad\mathsf E\big(X^k\big) \qquad\qquad\qquad\quad&=~ np\,\mathsf E\big((Y+1)^{(k-1)}\big) \end{align}$$

Where $f_X, f_Y$ are the probability mass functions of the respective binomial distributions.

(Note: $0^k=0$, so the term at $x=0$ vanishes, which is why it is usually omitted in the series for the LHS. This doesn't happen on the RHS, since $(0+1)^{k-1}\neq 0$.)


In full: $${\begin{align} \tag 1 \mathsf E(X^k) ~&=~\sum_{x=0}^n x^k \binom{n}{x} p^x(1-p)^{n-x} \\[0.5ex]\tag 2 &=~ 0+\sum_{x=1}^n x^k \binom{n}{x} p^x(1-p)^{n-x} \\[0.5ex]\tag 3 &=~ \sum_{x=1}^{n} x^{k-1}\cdot x\binom n x p^x(1-p)^{n-x} \\[0.5ex]\tag 4 &=~ \sum_{x=1}^{n} x^{k-1}\cdot n\binom {n-1}{x-1} p^x(1-p)^{n-x} \\[0.5ex]\tag 5 &=~ \sum_{y+1=1}^{n} (y+1)^{k-1}\cdot n\binom {n-1}{(y+1)-1} p^{y+1}(1-p)^{n-(y+1)} \\[0.5ex]\tag 6 &=~ \sum_{y=0}^{n-1} (y+1)^{k-1}\cdot n\binom {n-1}{y} p\cdot p^y(1-p)^{n-1-y} \\[0.5ex]\tag 7 &=~ np\,\sum_{y=0}^{n-1} (y+1)^{k-1}\binom {n-1}{y} p^y(1-p)^{(n-1)-y} \\[2ex]\tag 8 \mathsf E(X^k) ~&=~ np\,\mathsf E((Y+1)^{k-1}) \end{align}\\\Box}$$
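The boxed identity can also be verified numerically; a small Python sketch, with arbitrary illustrative values for $n$, $p$, and $k$:

```python
from math import comb, isclose

def binom_pmf(m, q, j):
    """P(W = j) for W ~ Bin(m, q)."""
    return comb(m, j) * q**j * (1 - q) ** (m - j)

n, p, k = 6, 0.3, 3  # arbitrary illustrative parameters

# Left-hand side: E[X^k] computed directly, with X ~ Bin(n, p)
lhs = sum(x**k * binom_pmf(n, p, x) for x in range(n + 1))

# Right-hand side: np * E[(Y+1)^{k-1}], with Y ~ Bin(n-1, p)
rhs = n * p * sum((y + 1) ** (k - 1) * binom_pmf(n - 1, p, y) for y in range(n))

print(lhs, rhs)
assert isclose(lhs, rhs)
```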