Why are these claims true?


I'm reading a textbook that makes these claims:

(1) Given a random variable $X$ that is $1/N$ times the sum of $Np$ independent random numbers, each of which takes on the values $1$ or $-1$, $X$ has a binomial distribution with mean $0$ and variance $\sigma^2 = p/N$.

(2) Assuming $Np$ is large, the distribution of $X$ can be approximated by a Gaussian with the same mean and variance.

Can someone help me understand why these claims are valid?

There are 2 answers below.

BEST ANSWER

(1) This follows from direct computation. Let $X={1 \over N} \sum_{k=1}^{Np} X_k$. Linearity gives $EX = {1 \over N} \sum_{k=1}^{Np} E X_k = 0$. Since the $X_k$ are independent with mean $0$, the cross terms vanish ($E[X_i X_j] = E X_i \, E X_j = 0$ for $i \neq j$), so $\operatorname{var} X = E X^2 = { 1\over N^2} \sum_i \sum_j E [X_i X_j] = { 1\over N^2} \sum_k E X_k^2 = { 1\over N^2} Np \, E X_1^2 = { p\over N}$, using $E X_1^2 = 1$.
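The moment computation above can be checked numerically. This is a minimal sketch; the parameter choices $N = 1000$, $p = 0.5$ and the number of trials are my own, picked only for illustration:

```python
import random

# Hypothetical parameters for illustration only.
N, p = 1000, 0.5
Np = int(N * p)
trials = 20000

samples = []
for _ in range(trials):
    # X = (1/N) * (sum of Np independent +/-1 variables)
    s = sum(random.choice((1, -1)) for _ in range(Np))
    samples.append(s / N)

mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(mean, var, p / N)  # mean should be near 0, var near p/N = 0.0005
```

The empirical mean and variance should land close to $0$ and $p/N$ respectively, up to Monte Carlo noise.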

Note that $X$ is not binomially distributed (it takes values in $\{-p, -p + {2 \over N}, \dots, p\}$, not in $\{0, 1, 2, \dots\}$). However, assuming that $Np$ is an integer, we have ${1 \over 2}(NX+Np) = \sum_{k=1}^{Np} {X_k + 1 \over 2}$, a sum of $Np$ iid. variables that take the values $0,1$ with equal probability. Hence ${1 \over 2}(NX+Np) \sim B(Np,{1 \over 2})$.
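The transformed variable can also be checked by simulation. A sketch with small hypothetical parameters ($N = 100$, $p = 0.2$, so $Np = 20$); a $B(20, \tfrac12)$ variable has mean $Np/2 = 10$ and variance $Np/4 = 5$:

```python
import random

# Hypothetical parameters: N = 100, p = 0.2, so Np = 20.
N, p = 100, 0.2
Np = int(N * p)
trials = 50000

ys = []
for _ in range(trials):
    x = sum(random.choice((1, -1)) for _ in range(Np)) / N
    # The answer's transformation: (N*X + N*p)/2 should be Binomial(Np, 1/2).
    ys.append(round((N * x + Np) / 2))

mean = sum(ys) / trials
var = sum((y - mean) ** 2 for y in ys) / trials
# Binomial(Np, 1/2) has mean Np/2 = 10 and variance Np/4 = 5.
print(mean, var)
```

Every transformed sample is an integer between $0$ and $Np$, as a binomial count must be.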

(2) If $Np$ is an integer, write $S_m = {1 \over m} (W_1+ \cdots + W_m)$, where the $W_k$ are the $\pm 1$ variables from (1), so iid. with $\mu=E W_k = 0$, $\sigma^2 = \operatorname{var} W_k = 1$. Taking $m = Np$ gives $S_{Np} = {X \over p}$.

One of the many central limit theorems states that $P[\sqrt{m} {S_m -\mu \over \sigma} \le z ] = P[\sqrt{m} S_m \le z ]\to \Phi(z)$ as $m \to \infty$.

Hence for large integer values of $m=Np$ we have $P[\sqrt{Np} {X \over p} \le z ] \approx \Phi(z)$, or equivalently, $P[X \le z] \approx \Phi\big({ z \over \sqrt{p / N} } \big)$, and we see that, informally, $X$ is approximately normal with mean $0$ and variance ${p \over N}$.
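The Gaussian approximation can be eyeballed by comparing the empirical CDF of $X$ with $\Phi(z/\sigma)$, $\sigma = \sqrt{p/N}$, at a few points. A sketch under my own parameter choices ($N = 400$, $p = 0.5$); because $X$ is discrete, small discrepancies of continuity-correction size remain:

```python
import math
import random

# Hypothetical parameters for illustration only.
N, p = 400, 0.5
Np = int(N * p)
trials = 20000

xs = sorted(sum(random.choice((1, -1)) for _ in range(Np)) / N
            for _ in range(trials))

def Phi(t):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

sigma = math.sqrt(p / N)
# Compare the empirical CDF of X with Phi(z / sigma) at a few points.
for z in (-2 * sigma, 0.0, sigma):
    empirical = sum(1 for x in xs if x <= z) / trials
    print(z, empirical, Phi(z / sigma))
```

The two columns should agree to within a few percent, with the largest gap near $z = 0$ where the point mass of the discrete distribution is not negligible.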


HINTS

(1) Let $B$ denote a Bernoulli variable; then $Y = 2B-1$ takes the values $1$ or $-1$, with $P(Y=1)$ equal to the success probability of $B$.
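A tiny sketch of this hint: the map $B \mapsto 2B - 1$ sends $\{0,1\}$ to $\{-1,1\}$ and preserves the success probability. The value $q = 0.3$ is an arbitrary choice of mine for illustration:

```python
import random

# Hypothetical success probability for the Bernoulli variable B.
q = 0.3
trials = 100000

count = 0
for _ in range(trials):
    b = 1 if random.random() < q else 0  # B ~ Bernoulli(q)
    y = 2 * b - 1                        # Y = 2B - 1 takes values -1 or 1
    assert y in (-1, 1)
    if y == 1:
        count += 1

print(count / trials)  # should be close to q = 0.3
```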

(2) Central limit theorem?