Distribution of a binomial variable squared


If I know $X$ is a binomial random variable, how can I find the distribution of $Y = X^2$? I know that $P(Y = x^2) = P(X = x)$, but does this distribution have a standard name? In particular, how can I find its expected value?

Thanks!

EDIT:

Thank you all! (-:

There are 6 answers below.

Best answer:

Usually we derive the variance of the binomial distribution from the calculation of its second moment, so appealing to the variance in order to get the second moment would be somewhat circular.

The direct calculation is as follows. Consider $$\begin{align*} {\rm E}[X(X-1)] &= \sum_{k=0}^n k(k-1) \binom{n}{k} p^k (1-p)^{n-k} \\ &= \sum_{k=2}^n \frac{n(n-1)(n-2)!}{(k-2)!((n-2)-(k-2))!} p^2 p^{k-2} (1-p)^{(n-2)-(k-2)} \\ &= n(n-1)p^2 \sum_{k'=0}^{n-2} \frac{(n-2)!}{(k')!((n-2)-k')!} p^{k'} (1-p)^{(n-2)-k'} \\ &= n(n-1)p^2 \sum_{k'=0}^{n-2} \binom{n-2}{k'} p^{k'} (1-p)^{(n-2)-k'} \\ &= n(n-1)p^2, \end{align*}$$ because the summand in the penultimate step is simply the probability mass function of a binomial random variable with parameters $n-2$ and $p$, so it sums to $1$ (which is also evident via the binomial theorem). Indeed, this calculation is easily generalized: $${\rm E}[X(X-1)\cdots(X-m)] = \frac{n!}{(n-m-1)!}p^{m+1},$$ for which the above is the special case $m = 1$. For $m = 0$, we easily get ${\rm E}[X] = np$. Then combining these results via the linearity of expectation gives $${\rm E}[X^2] = {\rm E}[X(X-1) + X] = {\rm E}[X(X-1)] + {\rm E}[X] = n(n-1)p^2 + np.$$
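As a quick numerical sanity check (not part of the answer itself), the factorial-moment identity above can be verified directly from the binomial pmf; the parameter values $n = 10$, $p = 0.3$ below are arbitrary illustrative choices:

```python
from math import comb

n, p = 10, 0.3  # illustrative parameters, not from the answer

def pmf(k):
    """Binomial(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# E[X(X-1)] and E[X^2] computed directly from the pmf
e_falling = sum(k * (k - 1) * pmf(k) for k in range(n + 1))
e_x2 = sum(k * k * pmf(k) for k in range(n + 1))

# Compare against the closed forms derived above
assert abs(e_falling - n * (n - 1) * p**2) < 1e-12
assert abs(e_x2 - (n * (n - 1) * p**2 + n * p)) < 1e-12
```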

Answer:

Hint: $E[X^2] = E[X]^2+\text{Var}[X]$. Do you know the mean and variance of $X$?

EDIT: Since a few other answers have shown a derivation of the mean and variance of a binomial random variable, I'll show one as well.

If $X \sim \text{Binomial}(n,p)$, then we can write $X = \displaystyle\sum_{i = 1}^{n}Y_i$ where the $Y_i$ are i.i.d. $\text{Bernoulli}(p)$.

Each $Y_i$ takes the value $1$ with probability $p$ and $0$ with probability $1-p$.

Hence, $E[Y_i] = p \cdot 1 + (1-p) \cdot 0 = p$ and $\text{Var}[Y_i] = p \cdot (1-p)^2 + (1-p) \cdot (0-p)^2 = p(1-p)$.

By linearity of expectation, $E[X] = np$. Since the $Y_i$'s are independent, $\text{Var}[X] = np(1-p)$.

Therefore, $E[X^2] = E[X]^2+\text{Var}[X] = (np)^2+np(1-p)$.
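The Bernoulli-sum construction can be checked exactly by summing over all $2^n$ outcomes of the Bernoulli vector; a small sketch with illustrative values $n = 6$, $p = 0.4$ (not taken from the answer):

```python
from itertools import product

n, p = 6, 0.4  # illustrative parameters

ex = ex2 = 0.0
# Enumerate every outcome (y_1, ..., y_n) of the i.i.d. Bernoulli vector
for ys in product((0, 1), repeat=n):
    prob = 1.0
    for y in ys:
        prob *= p if y == 1 else 1 - p
    x = sum(ys)          # X = Y_1 + ... + Y_n
    ex += prob * x
    ex2 += prob * x * x

# E[X] = np and Var[X] = E[X^2] - E[X]^2 = np(1-p)
assert abs(ex - n * p) < 1e-12
assert abs(ex2 - ex**2 - n * p * (1 - p)) < 1e-12
```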

Answer:

Hint: You can use the fact that $$ Var(X)=E(X^{2})-E(X)^{2} $$

to find $E(X^{2})$. This assumes you know both the mean and variance of a binomial random variable.

Answer:

In addition to Vinay's comment on finding $\mathbf{E}X^2$, you can easily find the distribution of $X^2$ by squaring the values that $X$ takes (i.e., $0, 1, 4, 9, \ldots, n^2$).
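A minimal sketch of that construction (the parameter values $n = 5$, $p = 0.5$ are illustrative): the pmf of $Y = X^2$ is just the binomial pmf reindexed onto the squared support.

```python
from math import comb

n, p = 5, 0.5  # illustrative parameters

# P(Y = k^2) = P(X = k): same masses, squared support points
pmf_y = {k * k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

e_y = sum(y * prob for y, prob in pmf_y.items())  # E[Y] = E[X^2]

assert abs(sum(pmf_y.values()) - 1.0) < 1e-12     # a valid pmf
assert abs(e_y - ((n * p)**2 + n * p * (1 - p))) < 1e-12
```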

Answer:

Consider the probability generating function $f(s) = E[s^X]$.

$$ f(s) = \sum_{k=0}^n \binom nk s^kp^k(1-p)^{n-k} = (1-p+sp)^n \\ EX^2 = \sum_{k=0}^n \binom nk k^2 p^k(1-p)^{n-k} = f'(1) + f''(1) = np + n(n-1)p^2 $$
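A numeric spot-check of this pgf identity, approximating the derivatives at $s = 1$ by central finite differences (the values $n = 8$, $p = 0.25$ are illustrative, not from the answer):

```python
# pgf of Binomial(n, p): f(s) = (1 - p + s p)^n
n, p = 8, 0.25  # illustrative parameters
f = lambda s: (1 - p + s * p) ** n

h = 1e-4
f1 = (f(1 + h) - f(1 - h)) / (2 * h)            # ~ f'(1)  = np
f2 = (f(1 + h) - 2 * f(1) + f(1 - h)) / h**2    # ~ f''(1) = n(n-1)p^2

# E[X^2] = f'(1) + f''(1)
assert abs(f1 + f2 - (n * p + n * (n - 1) * p**2)) < 1e-5
```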

Answer:

$\begin{align} \text{E}(X^2) & = \sum\limits_{k = 0}^n k^2P[X = k]\\ & = \sum\limits_{k = 0}^n k^2\,^nC_kp^kq^{n - k}\\ & = \sum\limits_{k = 1}^n k^2\, \dfrac{n!}{k! (n - k)!} p^kq^{n - k}\\ & = \sum\limits_{k = 1}^n k \dfrac{n!}{(k - 1)! (n - k)!} p^kq^{n - k}\\ & = \sum\limits_{k = 1}^n (k - 1 + 1) \dfrac{n!}{(k - 1)! (n - k)!} p^kq^{n - k}\\ & = \sum\limits_{k = 1}^n (k - 1) \dfrac{n!}{(k - 1)! (n - k)!} p^kq^{n - k} + \sum\limits_{k = 1}^n \dfrac{n!}{(k - 1)! (n - k)!} p^kq^{n - k}\\ & = \sum\limits_{k = 2}^n \dfrac{n!}{(k - 2)! (n - k)!} p^kq^{n - k} + \sum\limits_{k = 1}^n \dfrac{n!}{(k - 1)! (n - k)!} p^kq^{n - k}\\ & = n(n-1)p^2\sum\limits_{k = 2}^n \dfrac{(n-2)!}{(k - 2)! (n - k)!} p^{k-2}q^{(n-2) - (k-2)} +\\& \qquad np\sum\limits_{k = 1}^n \dfrac{(n-1)!}{(k - 1)! (n - k)!} p^{k-1}q^{(n-1) - (k-1)}\\ & = n(n-1)p^2\sum\limits_{k=2}^n \, ^{n-2}C_{k-2}p^{k-2}q^{(n-2) - (k-2)} + np\sum\limits_{k = 1}^n \, ^{n-1}C_{k-1} p^{k-1}q^{(n-1) - (k-1)}\\ & = n(n - 1)p^2(p + q)^{n - 2} + np(p + q)^{n - 1}\\ & = n^2p^2 - np^2 + np\\ & = \boxed{n^2p^2 + npq} \end{align}$
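As a final sanity check (not part of the derivation), the boxed identity $E(X^2) = n^2p^2 + npq$ can be verified in exact rational arithmetic; the values $n = 7$, $p = 2/5$ are illustrative:

```python
from fractions import Fraction
from math import comb

n = 7                 # illustrative parameters
p = Fraction(2, 5)
q = 1 - p

# E[X^2] summed exactly over the binomial pmf
ex2 = sum(k * k * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

assert ex2 == n**2 * p**2 + n * p * q  # exact equality, no rounding
```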